Mobile testing is harder than web testing: two platforms with different design languages, thousands of device configurations, OS version fragmentation, network variability, and app store gatekeepers that reject builds for subtle violations. This checklist targets comprehensive coverage across iOS and Android without testing every possible combination, which is impossible, or too few, which misses real user issues.

Device matrix: selecting what to test on

Testing on every device is impractical. Testing on too few devices misses real-world issues. The goal is a device matrix that maximizes coverage with a manageable number of devices.

iOS device selection

Apple’s controlled ecosystem makes iOS device selection straightforward. Cover these dimensions: current iPhone generation (iPhone 15/16 series), previous generation (iPhone 14 series), oldest supported generation (based on your minimum iOS version), one iPad model if your app supports tablets, and smallest and largest screen sizes (iPhone SE vs iPhone Pro Max).

A typical iOS matrix: 4-5 devices covering 2-3 iOS versions. Run the latest iOS version on most devices and keep one device on the oldest supported version.

Android device selection

Android fragmentation requires a broader approach. Cover these dimensions: Samsung Galaxy S series (largest global market share), Samsung Galaxy A series (most popular mid-range), Google Pixel (reference Android implementation), Xiaomi or other regional champion (depending on target market), a budget device with limited RAM (2-3 GB), smallest and largest screen sizes, and oldest supported Android version.

A typical Android matrix: 6-8 devices covering 3-4 Android versions and 3-4 manufacturers. Manufacturer-specific Android modifications (Samsung One UI, Xiaomi MIUI) cause UI rendering and behavior differences that stock Android testing misses.
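The matrices above can be expressed as plain data so a parameterized suite iterates over them. A minimal sketch, assuming illustrative device models and OS versions (substitute your own market data):

```python
# Example device matrices (device models and OS versions are illustrative,
# not recommendations; replace with your analytics data).
IOS_MATRIX = [
    {"model": "iPhone 16", "os": "18"},
    {"model": "iPhone 15", "os": "18"},
    {"model": "iPhone 14", "os": "17"},
    {"model": "iPhone SE (3rd gen)", "os": "16"},  # oldest supported + smallest screen
    {"model": "iPad Air", "os": "18"},             # tablet coverage
]

ANDROID_MATRIX = [
    {"model": "Galaxy S24", "os": "14", "maker": "Samsung"},
    {"model": "Galaxy A54", "os": "14", "maker": "Samsung"},
    {"model": "Pixel 8", "os": "15", "maker": "Google"},
    {"model": "Redmi Note 12", "os": "13", "maker": "Xiaomi"},
    {"model": "Galaxy A14", "os": "13", "maker": "Samsung"},  # budget, limited RAM
    {"model": "Pixel 4a", "os": "12", "maker": "Google"},     # oldest supported version
]

def coverage(matrix, key):
    """Distinct values covered for one dimension (OS version, manufacturer)."""
    return {device[key] for device in matrix if key in device}
```

A quick sanity check such as `coverage(ANDROID_MATRIX, "maker")` confirms the matrix spans the manufacturers you intended before a release cycle starts.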

Cloud device farms

Supplement your physical device lab with cloud services for extended coverage. BrowserStack, AWS Device Farm, and Sauce Labs provide access to hundreds of real devices. Use cloud devices for automated regression across the extended matrix and reserve physical devices for manual exploratory testing and performance validation.
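Cloud farm runs are typically driven through Appium capabilities. A hedged sketch in the style of BrowserStack's `bstack:options` vendor prefix (exact capability names and the app upload handle format vary by provider and version, so treat every key here as an assumption to verify against your provider's documentation):

```python
def farm_capabilities(device, os_version, app_url):
    """Build Appium capabilities for a cloud device farm session.

    `device`, `os_version`, and `app_url` are placeholders supplied by the
    caller; "bstack:options" follows BrowserStack's vendor-prefix style.
    """
    return {
        "platformName": "android",
        "appium:deviceName": device,           # e.g. "Samsung Galaxy S23"
        "appium:platformVersion": os_version,  # e.g. "13.0"
        "appium:app": app_url,                 # app handle from the provider upload
        "bstack:options": {
            "projectName": "mobile-regression",  # illustrative metadata
            "networkLogs": True,
        },
    }
```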

Functional testing checklist

These tests apply to both platforms. Execute on every device in your matrix.

Installation and launch. Fresh installation from store or TestFlight/internal testing track. App launches within 3 seconds on mid-range devices. First-run experience (onboarding, permissions) works correctly. App icon and splash screen render correctly at all resolutions.

Core user journeys. Map every critical path (registration, login, primary feature usage, payment, settings) and test end-to-end. Each journey must complete without errors, crashes, or unexpected behavior on every device in the matrix.

Navigation and UI. All screens render correctly across screen sizes. Navigation elements (tabs, back buttons, hamburger menus) function consistently. Landscape and portrait orientation work where supported. Safe area insets are respected on notched devices.

Input handling. Keyboard types match input fields (email keyboard for email, numeric for phone). Auto-fill and password manager integration works. Form validation provides clear error messages. Copy-paste functions correctly in all text fields.

Offline behavior. App handles loss of connectivity gracefully. Pending actions queue and sync when connection returns. Cached content displays when offline. Error messages clearly communicate offline state without crashing.
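The queue-and-sync behavior above is the part most worth modeling explicitly in tests. A minimal sketch of the assumed design (not a specific library): actions queue while offline and replay in order when connectivity returns.

```python
from collections import deque

class OfflineQueue:
    """Queue pending actions while offline; flush them in order on reconnect."""

    def __init__(self, send):
        self.send = send       # callable that performs the network request
        self.online = True
        self.pending = deque()

    def submit(self, action):
        if self.online:
            self.send(action)
        else:
            self.pending.append(action)  # queue instead of failing

    def set_online(self, online):
        self.online = online
        while online and self.pending:
            self.send(self.pending.popleft())  # replay in original order
```

A test toggles `set_online(False)`, submits actions, verifies nothing was sent, then toggles back online and asserts the actions arrive in submission order.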

Push notifications. Notifications arrive with correct content and formatting. Tapping a notification navigates to the correct screen. Notification permissions flow follows platform guidelines. Background and foreground notification handling both work correctly.

Deep linking and app links. Universal links (iOS) and App Links (Android) open the correct in-app screen. Deferred deep links work for new installs. Link handling works from email, SMS, social media, and web browsers.
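Deep link tests are easiest when both Universal Links and App Links resolve through one route table, so the link-to-screen mapping can be asserted independently of either platform. A sketch with hypothetical routes and screen names:

```python
from urllib.parse import urlparse

# Hypothetical route table shared by iOS and Android link handling.
ROUTES = {
    "/product": "ProductDetailScreen",
    "/cart": "CartScreen",
    "/settings": "SettingsScreen",
}

def resolve_deep_link(url, fallback="HomeScreen"):
    """Map an incoming link to an in-app screen.

    Unknown paths fall back to home rather than crashing, matching the
    graceful-handling expectation in the checklist.
    """
    path = urlparse(url).path.rstrip("/") or "/"
    return ROUTES.get(path, fallback)
```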

Platform-specific testing

iOS-specific checklist

App Tracking Transparency. ATT prompt appears at the correct moment. App functions correctly regardless of tracking permission choice. IDFA is not accessed before user consent. Privacy nutrition labels accurately reflect the app's actual data collection.

Human Interface Guidelines. Navigation follows iOS conventions (back swipe, tab bars). System controls (date pickers, action sheets) use native iOS components. Dynamic Type support for accessibility text sizing. Dark Mode support renders all screens correctly.

Background processing. Background App Refresh functions correctly. Background downloads complete as expected. Background location updates work without excessive battery drain. App resumes from background state without data loss.

Apple ecosystem integration. Handoff works between iPhone and iPad (if applicable). Widgets display correctly on home screen and lock screen. Shortcuts and Siri Suggestions integration (if implemented). Sign in with Apple functions correctly.

App Store compliance. No private API usage. No references to competing platforms. In-app purchase implementation follows StoreKit guidelines. Content ratings match actual content.

Android-specific checklist

Manufacturer-specific behavior. Samsung One UI, Xiaomi MIUI, and Huawei EMUI modify standard Android behavior. Test notification delivery on manufacturer-modified Android (aggressive battery optimization on Xiaomi and Huawei kills background processes). Verify camera and biometric APIs work across manufacturer implementations.

Android version compatibility. Test on minimum supported API level and current API level. Verify runtime permissions on Android 6.0+. Verify scoped storage on Android 10+. Verify notification channels on Android 8.0+. Verify predictive back gesture on Android 14+.
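These version gates can parameterize a test suite: given the API level of the device under test, select which platform-specific checks apply. A sketch using the API levels behind the versions listed above (the helper itself is illustrative):

```python
# API level at which each checklist item becomes relevant.
VERSION_CHECKS = {
    "runtime_permissions": 23,    # Android 6.0
    "notification_channels": 26,  # Android 8.0
    "scoped_storage": 29,         # Android 10
    "predictive_back": 34,        # Android 14
}

def checks_for(api_level):
    """Checklist items that apply on a device running the given API level."""
    return sorted(name for name, minimum in VERSION_CHECKS.items()
                  if api_level >= minimum)
```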

Google Play requirements. Target API level meets Google Play’s current requirement. Data safety section accurately reflects data collection. App functions correctly with all permissions denied. Content rating questionnaire accurately completed.

Hardware variability. Test on devices with different RAM amounts (2 GB, 4 GB, 8 GB). Verify behavior on devices with limited storage. Test camera functionality across different camera hardware. Verify biometric authentication on fingerprint and face unlock devices.

Automation tool selection

| Tool | Platform | Language | Best For |
| --- | --- | --- | --- |
| XCTest/XCUITest | iOS only | Swift/Obj-C | Native iOS testing, Apple ecosystem |
| Espresso | Android only | Java/Kotlin | Native Android testing, fast execution |
| Appium | Cross-platform | Any (via WebDriver) | Cross-platform suites, existing Selenium skills |
| Detox | Cross-platform | JavaScript | React Native applications |
| Maestro | Cross-platform | YAML | Simple flows, quick setup, no coding |
| Flutter Driver | Cross-platform | Dart | Flutter applications |

Recommendation for most teams: Use XCUITest for iOS and Espresso for Android if you have native development skills. Use Appium if you need a single cross-platform framework and have Selenium experience. Use Maestro for teams without automation experience who need quick coverage of critical paths.
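To illustrate the Maestro option, a minimal flow covering one critical path might look like this (the `appId` and on-screen labels are placeholders; adjust selectors to your app):

```yaml
# Sketch of a Maestro login flow.
appId: com.example.app
---
- launchApp
- tapOn: "Log in"
- tapOn: "Email"
- inputText: "qa@example.com"
- tapOn: "Password"
- inputText: "secret-password"
- tapOn: "Continue"
- assertVisible: "Welcome"
```

The flow is declarative YAML rather than code, which is why Maestro suits teams without automation experience: the same file runs against both the iOS and Android builds.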

Non-functional testing

Performance. App startup time under 3 seconds on mid-range devices. Screen transitions under 300ms. Scrolling at 60 fps without frame drops. Memory usage stays within platform recommendations (under 200 MB for most apps). No memory leaks over extended usage sessions.
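The numbers above become actionable when expressed as budgets a performance harness asserts against. A sketch, assuming the measured values come from your own profiling tooling:

```python
# Performance budgets from the checklist above.
BUDGETS = {
    "startup_s": 3.0,      # cold start on a mid-range device
    "transition_ms": 300,  # screen transition latency
    "min_fps": 60,         # scroll frame rate
    "memory_mb": 200,      # steady-state memory for most apps
}

def check_budgets(measured):
    """Return the names of any budgets the measured run violates."""
    failures = []
    if measured["startup_s"] > BUDGETS["startup_s"]:
        failures.append("startup_s")
    if measured["transition_ms"] > BUDGETS["transition_ms"]:
        failures.append("transition_ms")
    if measured["fps"] < BUDGETS["min_fps"]:
        failures.append("min_fps")
    if measured["memory_mb"] > BUDGETS["memory_mb"]:
        failures.append("memory_mb")
    return failures
```

Failing the build on a non-empty result turns performance regressions into release blockers rather than post-launch bug reports.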

Battery consumption. Measure battery drain during active usage and background operation. Compare against similar apps in the category. Identify and optimize high-drain operations (GPS, animations, network polling).

Network conditions. Test on WiFi, 4G, 3G, and EDGE connections. Test transitions between network types (WiFi to cellular). Verify behavior on high-latency connections (300ms+ round trip). Test download and upload reliability on intermittent connections.

Security. Data encrypted at rest and in transit. No sensitive data in logs. Certificate pinning implemented (if required). Biometric authentication cannot be bypassed. Session management handles token expiration correctly.

Accessibility. Screen reader (VoiceOver on iOS, TalkBack on Android) can navigate all screens. Touch targets are at least 44x44 points (iOS) or 48x48 dp (Android). Color contrast meets WCAG AA standards. All images have descriptive labels.
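The touch-target minimums quoted above can be checked automatically against a view-hierarchy dump. A sketch with an illustrative element format (assume `width`/`height` are in points on iOS and dp on Android):

```python
# Minimum touch-target sizes per platform guidelines (44x44 pt iOS, 48x48 dp Android).
MIN_TARGET = {"ios": 44, "android": 48}

def undersized_targets(platform, elements):
    """Return ids of tappable elements smaller than the platform minimum."""
    minimum = MIN_TARGET[platform]
    return [e["id"] for e in elements
            if e["width"] < minimum or e["height"] < minimum]
```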

How ARDURA Consulting supports mobile testing

Mobile testing requires platform specialists who understand iOS and Android ecosystems deeply. Finding engineers with expertise across both platforms, plus automation skills, is one of the most challenging QA hiring scenarios.

500+ senior specialists in the ARDURA Consulting network include mobile QA engineers experienced with XCUITest, Espresso, Appium, and Detox. They have tested apps across fintech, e-commerce, healthcare, and enterprise domains.

2-week onboarding means your mobile testing team is operational before your next sprint ends. No 2-month recruitment process while your release date approaches.

40% average cost savings compared to Western European mobile QA rates. A dedicated mobile testing team of 2-3 engineers through ARDURA Consulting costs less than a single senior mobile QA engineer hired locally.

99% retention rate ensures your mobile testing specialists stay through the entire release cycle and beyond, maintaining device knowledge and test suite familiarity. With 211+ delivered projects behind us, contact ARDURA Consulting to build your mobile testing capability.