

Google Maps SDK on Android — Implementation Deep-Dive Beyond the Tutorials

The Google Maps tutorials online stop at exactly the wrong place. They show you how to drop a map fragment into a layout, add a marker at a hardcoded location, maybe enable “my location.” Done. What they don’t teach is what happens when you actually have to build a map screen for production — one that streams 50 vehicle markers updating every 3 seconds, draws polyline routes that change as data arrives, clusters markers as the user zooms, prefetches tiles for offline regions, and doesn’t drain the battery in 15 minutes.

The reason those tutorials stop early is that the next layer down is genuinely complex. You’re managing a rendering pipeline (tiles fetched and cached at zoom levels), a viewport state machine (camera position drives what’s visible), a continuous data stream (locations and routes updating from the network), and a UI that has to stay responsive across all of it. Doing this well requires understanding how Google Maps SDK actually works internally, not just calling its public methods.

This post is the implementation deep-dive. We’ll walk through what happens from “the user opens the map screen” through tiles loading, viewport updates firing, markers rendering, and location streaming — with the implementation patterns at each layer. Examples come from a delivery tracking app, because real-time vehicle markers with live route polylines are the canonical Maps problem beyond ride-sharing. Future posts in this Maps cluster will go deeper on marker clustering, the Mapbox/Google/OSM comparison, and offline maps; this post is the foundation.


What Actually Happens When the Map Screen Opens

Forget the public API for a moment. Walk through what the Google Maps SDK does internally when a user navigates to your map screen:

  1. MapFragment or MapView attaches; SDK initialises native rendering surface (OpenGL ES context)
  2. SDK reads the camera position from saved state (or defaults to a world view)
  3. SDK computes which map tiles are needed for the current viewport and zoom level
  4. For each tile: check the disk cache; if hit, decode and queue for render. If miss, fire HTTPS request to Google’s tile servers
  5. Tiles arrive (200–1500ms first paint depending on connection); SDK composites them onto the GL surface
  6. If the camera moves (user pan/zoom or programmatic): re-compute needed tiles, repeat 4–5 for new ones
  7. Marker, polyline, and overlay layers render on top of tiles, clipped to viewport

That fourth step — tile fetching — is the part that surprises people: the SDK makes continuous HTTPS requests to googleapis.com/.../mapTiles/... as the camera moves. Each tile is a 256×256 PNG (or vector tile data on newer Maps renderers). At any moment, ~12–25 tiles are visible at the current zoom; pan or zoom changes that set, and the SDK fetches the difference.
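The visible-tile arithmetic is easy to reason about with the standard Web Mercator tiling scheme. A pure-Kotlin sketch, independent of the SDK (the XYZ tile convention here is the common public one; treating it as equivalent to Google's internal scheme is an assumption):

```kotlin
import kotlin.math.PI
import kotlin.math.cos
import kotlin.math.floor
import kotlin.math.ln
import kotlin.math.tan

// Convert a longitude to an XYZ tile column at the given zoom level.
fun tileX(lng: Double, zoom: Int): Int {
    val n = 1 shl zoom  // 2^zoom tiles per axis
    return floor((lng + 180.0) / 360.0 * n).toInt().coerceIn(0, n - 1)
}

// Convert a latitude to an XYZ tile row (Web Mercator; y grows southward).
fun tileY(lat: Double, zoom: Int): Int {
    val n = 1 shl zoom
    val latRad = Math.toRadians(lat)
    val y = (1.0 - ln(tan(latRad) + 1.0 / cos(latRad)) / PI) / 2.0
    return floor(y * n).toInt().coerceIn(0, n - 1)
}

// How many tiles a viewport spans: the difference of its corner tile indices.
fun visibleTileCount(
    north: Double, south: Double, east: Double, west: Double, zoom: Int
): Int {
    val cols = tileX(east, zoom) - tileX(west, zoom) + 1
    val rows = tileY(south, zoom) - tileY(north, zoom) + 1
    return cols * rows
}
```

At city zoom over a phone-sized viewport this lands in the ~12–25 tile range quoted above; each pan or zoom shifts the corner indices, and the delta between the old and new sets is what goes over the network.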

The SDK caches aggressively: tiles you’ve seen before are stored on disk (up to a default cap, configurable). Pan back to a region you saw 2 minutes ago, no network hit. But the first encounter with any region requires the network, and if your app starts the user in a new region, expect 1–2 seconds of tile loading.

Implications for your app:

  • The map is genuinely network-dependent. On bad connections, expect a poor first-render experience — plan for it.
  • Tile cache is per-app, per-install. Reinstalling clears it.
  • You can warm the tile cache for known regions (e.g., by moving the camera there programmatically while the user is still on a previous screen) — but there’s no public API to forcibly download a region for offline use without Google’s newer offline maps APIs, which come with their own constraints.

Setting Up the Map — What the Tutorials Get Right

The basic setup is genuinely simple. Add the SDK, get an API key, drop in a map composable.

// build.gradle.kts
implementation("com.google.maps.android:maps-compose:6.4.0")
implementation("com.google.android.gms:play-services-maps:19.0.0")
implementation("com.google.maps.android:maps-compose-utils:6.4.0")  // For clustering, etc.

// AndroidManifest.xml — the API key
<application>
    <meta-data
        android:name="com.google.android.geo.API_KEY"
        android:value="${MAPS_API_KEY}" />
</application>

API key best practice: never hardcode the key or check it into git. Inject it via gradle.properties or the Secrets Gradle Plugin, and restrict the key in Google Cloud Console to your package name + signing certificate fingerprint. With that restriction, Maps requests using your key from any other app are rejected. This is the only thing that stops an attacker from extracting the key from your APK and billing their usage to you.
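A minimal wiring for the injection (assuming the Secrets Gradle Plugin; the property name `MAPS_API_KEY` matches the manifest placeholder above):

```kotlin
// build.gradle.kts (project level)
plugins {
    id("com.google.android.libraries.mapsplatform.secrets-gradle-plugin") version "2.0.1" apply false
}

// build.gradle.kts (app module)
plugins {
    id("com.google.android.libraries.mapsplatform.secrets-gradle-plugin")
}

// local.properties (never committed; keep it in .gitignore)
// MAPS_API_KEY=<your key>
```

The plugin reads `local.properties` at build time and exposes each entry as a manifest placeholder, which is what `${MAPS_API_KEY}` in the manifest resolves against.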

// The basic Compose map
@Composable
fun DeliveryMap(modifier: Modifier = Modifier) {
    val warehouseLocation = LatLng(37.7749, -122.4194)
    val cameraPositionState = rememberCameraPositionState {
        position = CameraPosition.fromLatLngZoom(warehouseLocation, 12f)
        // 12f zoom = city level. Higher = closer in.
        // Levels: 0 (world) - 21 (building)
    }

    GoogleMap(
        modifier = modifier,
        cameraPositionState = cameraPositionState,
        properties = MapProperties(
            isMyLocationEnabled = false,  // Set true only if you have location permission
            mapType = MapType.NORMAL
        ),
        uiSettings = MapUiSettings(
            zoomControlsEnabled = false,  // Most apps want custom UI
            myLocationButtonEnabled = false
        )
    )
}

That works. It also doesn’t do anything useful. The interesting part starts when you add markers, polylines, and continuous updates.


Markers — The Real Cost Model

A marker on Google Maps is conceptually simple: a position, an icon, optional info window. Performance-wise, it’s not free. Each marker is a draw call (or part of a batched draw call), each marker icon is a texture in GPU memory, each marker tap is a hit-test against the visible set.

The numbers that matter:

  • ~100 markers: no issues anywhere
  • ~500 markers: noticeable frame drops on mid-range devices when panning
  • ~1000 markers: jank on most devices, possibly OOM if marker icons are large
  • ~5000 markers: don’t do this without clustering

For the delivery tracking app: if you have 50 vehicles in the user’s viewport, you’re fine. If you have 5000 vehicles in the entire fleet and you’re trying to render all of them so the user can zoom into any region, you need clustering (covered in the next post in this Maps cluster).

@Composable
fun VehicleMarkers(
    vehicles: List<Vehicle>,
    onMarkerClick: (Vehicle) -> Unit
) {
    vehicles.forEach { vehicle ->
        Marker(
            state = rememberMarkerState(position = LatLng(vehicle.lat, vehicle.lng)),
            // ✅ rememberMarkerState binds the marker to a stable state object
            // The marker animates between positions when the state updates
            title = vehicle.driverName,
            snippet = "${vehicle.deliveryCount} deliveries today",
            icon = vehicleIcon(vehicle.status),
            onClick = {
                onMarkerClick(vehicle)
                false  // Returning false runs the default behavior too (shows the info window)
            }
        )
    }
}

// Cache marker icons — recreating BitmapDescriptors on every recomposition is expensive
@Composable
fun vehicleIcon(status: VehicleStatus): BitmapDescriptor {
    val context = LocalContext.current
    return remember(status) {
        when (status) {
            VehicleStatus.ACTIVE -> bitmapDescriptorFromVector(context, R.drawable.ic_truck_green)
            VehicleStatus.IDLE -> bitmapDescriptorFromVector(context, R.drawable.ic_truck_yellow)
            VehicleStatus.OFFLINE -> bitmapDescriptorFromVector(context, R.drawable.ic_truck_gray)
        }
    }
}

Two things worth lingering on:

1. rememberMarkerState binds the marker to a MarkerState object. When the position changes, Maps SDK animates the marker smoothly to the new location instead of teleporting. For a delivery tracker showing vehicle movement, this is the difference between “the truck appears to move” and “the truck teleports every 3 seconds.”

2. Marker icons must be cached. BitmapDescriptorFactory.fromResource(...) decodes a Bitmap each call. Calling it inside a recomposing composable allocates a new Bitmap every time the parent recomposes. For 50 markers recomposing 60 times a second during smooth scroll, that’s 3000 Bitmap allocations per second. remember(status) { ... } caches per status; if you have only 3 statuses, you decode 3 bitmaps total instead of 3000/second.
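The `bitmapDescriptorFromVector` helper used above isn't part of the SDK; it's hand-rolled. A minimal sketch (sizing via the drawable's intrinsic bounds is an assumption; adjust for density or tinting as needed):

```kotlin
import android.content.Context
import android.graphics.Bitmap
import android.graphics.Canvas
import androidx.core.content.ContextCompat
import com.google.android.gms.maps.model.BitmapDescriptor
import com.google.android.gms.maps.model.BitmapDescriptorFactory

// Rasterize a vector drawable into a BitmapDescriptor once.
// Callers are expected to cache the result (e.g. via remember).
fun bitmapDescriptorFromVector(context: Context, resId: Int): BitmapDescriptor {
    val drawable = requireNotNull(ContextCompat.getDrawable(context, resId))
    drawable.setBounds(0, 0, drawable.intrinsicWidth, drawable.intrinsicHeight)
    val bitmap = Bitmap.createBitmap(
        drawable.intrinsicWidth, drawable.intrinsicHeight, Bitmap.Config.ARGB_8888
    )
    drawable.draw(Canvas(bitmap))
    return BitmapDescriptorFactory.fromBitmap(bitmap)
}
```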


Continuous Marker Updates — The Real Performance Problem

Now the hard part. The user has the map open. Vehicles are streaming new locations every 3 seconds over a WebSocket, and each vehicle’s marker needs to update its position. How do you do this without melting the device?

The naive approach — everything in one big StateFlow<List<Vehicle>> — recomposes the entire marker list every update.

// ❌ The performance trap
@Composable
fun NaiveVehicleMap(viewModel: DeliveryViewModel) {
    val vehicles by viewModel.vehicles.collectAsStateWithLifecycle()
    // vehicles is List<Vehicle>, updated every 3 seconds with new positions

    GoogleMap(...) {
        VehicleMarkers(vehicles, onMarkerClick = { ... })
        // Every 3 seconds: the entire VehicleMarkers composable recomposes
        // For 50 vehicles, that’s 50 marker updates queued onto the GL thread
        // The map appears to stutter every 3 seconds
    }
}

The fix: each marker should observe its own vehicle’s state independently, so a single vehicle’s position update only affects its own marker.

// ✅ Per-marker state observation
class DeliveryViewModel : ViewModel() {
    // Per-vehicle StateFlow keyed by ID
    private val vehicleStates = mutableMapOf<String, MutableStateFlow<Vehicle>>()
    private val _activeVehicleIds = MutableStateFlow<Set<String>>(emptySet())
    val activeVehicleIds: StateFlow<Set<String>> = _activeVehicleIds.asStateFlow()

    fun observeVehicle(id: String): StateFlow<Vehicle> =
        vehicleStates.getOrPut(id) {
            MutableStateFlow(initialVehicleStateFor(id))
        }.asStateFlow()

    init {
        viewModelScope.launch {
            vehicleStream.collect { update ->
                vehicleStates[update.id]?.value = update
                // ✅ Only the StateFlow for THIS vehicle emits
                // Only the composable observing THIS vehicle recomposes
            }
        }
    }
}

@Composable
fun PerVehicleMarker(
    vehicleId: String,
    viewModel: DeliveryViewModel,
    onClick: (Vehicle) -> Unit
) {
    val vehicle by viewModel.observeVehicle(vehicleId).collectAsStateWithLifecycle()

    val markerState = rememberMarkerState(
        position = LatLng(vehicle.lat, vehicle.lng)
    )

    LaunchedEffect(vehicle.lat, vehicle.lng) {
        markerState.position = LatLng(vehicle.lat, vehicle.lng)
        // ✅ Position update only triggers when THIS vehicle moves
        // The Maps SDK animates the marker to the new position
    }

    Marker(
        state = markerState,
        title = vehicle.driverName,
        icon = vehicleIcon(vehicle.status),
        onClick = { onClick(vehicle); false }
    )
}

@Composable
fun OptimizedVehicleMap(viewModel: DeliveryViewModel) {
    val activeIds by viewModel.activeVehicleIds.collectAsStateWithLifecycle()

    GoogleMap(...) {
        activeIds.forEach { id ->
            key(id) {
                // key() ensures Compose tracks each marker by ID, not position
                PerVehicleMarker(id, viewModel, onClick = { ... })
            }
        }
    }
}

The mental model: scope state observation as narrowly as possible to what changes. The set of which vehicles are visible changes rarely (a vehicle joins or leaves the fleet, maybe once per minute). Each individual vehicle’s position changes frequently (every 3 seconds). These need separate flows so the rare change doesn’t recompose everything that depends on the frequent one.

This is the same Compose performance principle from the Animations and Modifiers posts — reading state at the lowest scope where it’s used. Maps applications expose this principle particularly clearly because the data is genuinely high-frequency.


Polylines — Routes That Update Live

For a delivery app: when a driver accepts a delivery, you draw the route from current location to destination. As they drive, the “completed” portion updates; possibly the route itself recalculates if traffic changes.

@Composable
fun DeliveryRoute(
    route: Route,
    completedDistance: Double  // Updates as driver progresses
) {
    val (completedSegment, remainingSegment) = remember(route, completedDistance) {
        splitRouteAtDistance(route, completedDistance)
    }

    Polyline(
        points = completedSegment,
        color = Color.Gray,  // Already-driven portion in gray
        width = 8f,
        zIndex = 1f
    )

    Polyline(
        points = remainingSegment,
        color = Color(0xFF4285F4),  // Remaining route in blue
        width = 10f,
        zIndex = 2f,  // Higher zIndex draws on top
        pattern = listOf(Dash(20f), Gap(10f))  // Optional dashed pattern
    )
}

Polyline performance considerations:

1. Point count matters. A polyline with 10,000 points (a long route at high resolution) hits performance issues during rendering and even more during repeated updates. Route geometry from a routing service typically arrives in an encoded polyline format (a few characters per point); decoded, a route across a city is often 200–800 points, which is fine.

2. Don’t recreate the polyline on every progress update. Splitting the polyline into “completed” and “remaining” based on the driver’s progress is fine if you do it efficiently. Recomputing both segments by walking from the start of the route every update is O(n) per update. Cache the segment index of the driver’s current position, only walk forward as they progress.

3. Polyline updates re-render the entire shape. Unlike markers (which animate smoothly), polyline updates are essentially “remove old, add new” operations. For frequently-updating routes, this is expensive. The pattern that works: only update the polyline when the user has crossed a meaningful distance (every ~50 meters), not on every GPS tick.
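Point 1 above mentions the encoded polyline format; the standard decoder is small enough to inline (pure Kotlin, following the algorithm Google documents for its polyline utilities):

```kotlin
// Decode Google's encoded polyline format: each point is a delta from the
// previous one, scaled by 1e5, zigzag-signed, and packed into 5-bit chunks
// offset into printable ASCII.
fun decodePolyline(encoded: String): List<Pair<Double, Double>> {
    val points = mutableListOf<Pair<Double, Double>>()
    var index = 0
    var lat = 0
    var lng = 0

    fun nextDelta(): Int {
        var result = 0
        var shift = 0
        var b: Int
        do {
            b = encoded[index++].code - 63
            result = result or ((b and 0x1f) shl shift)
            shift += 5
        } while (b >= 0x20)  // high bit set means another chunk follows
        // Low bit is the sign: odd means the original value was negative
        return if (result and 1 != 0) (result shr 1).inv() else result shr 1
    }

    while (index < encoded.length) {
        lat += nextDelta()
        lng += nextDelta()
        points.add(lat / 1e5 to lng / 1e5)
    }
    return points
}
```

Because each point is a delta, closely-spaced points cost only a chunk or two, which is why a dense city route stays compact on the wire.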

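The forward-only walk from point 2 can be sketched as a small tracker that caches the last segment index (planar distance here for brevity; real code over lat/lng would use haversine or SphericalUtil, and the coordinate pairs are placeholders):

```kotlin
import kotlin.math.hypot

// Splits a route at a travelled distance, caching the last segment index so
// each update walks forward from where the previous one stopped: O(1) amortized
// per update instead of O(n).
class RouteProgress(private val points: List<Pair<Double, Double>>) {
    private val cumulative = DoubleArray(points.size).also { acc ->
        for (i in 1 until points.size) {
            val (x0, y0) = points[i - 1]
            val (x1, y1) = points[i]
            acc[i] = acc[i - 1] + hypot(x1 - x0, y1 - y0)
        }
    }
    private var lastIndex = 0
    private var lastDistance = 0.0

    fun splitAt(distance: Double): Pair<List<Pair<Double, Double>>, List<Pair<Double, Double>>> {
        // Resume from the cached index; restart only if the caller went backwards
        var i = if (distance >= lastDistance) lastIndex else 0
        while (i < points.size - 1 && cumulative[i + 1] < distance) i++
        lastIndex = i
        lastDistance = distance

        if (i >= points.size - 1) return points to listOf(points.last())
        // Interpolate the exact split point inside segment i
        val t = ((distance - cumulative[i]) / (cumulative[i + 1] - cumulative[i])).coerceIn(0.0, 1.0)
        val (x0, y0) = points[i]
        val (x1, y1) = points[i + 1]
        val split = x0 + (x1 - x0) * t to y0 + (y1 - y0) * t
        return (points.take(i + 1) + split) to (listOf(split) + points.drop(i + 1))
    }
}
```

A splitRouteAtDistance like the one in the DeliveryRoute composable can delegate to a remembered instance of this tracker so progress updates never re-walk the route from the start.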

Camera Control — Following the Action

For tracking a moving vehicle, the camera should follow the vehicle as it moves — smoothly, not snapping. The Maps Compose API:

@Composable
fun TrackingCameraEffect(
    targetLocation: LatLng,
    cameraPositionState: CameraPositionState,
    isFollowing: Boolean
) {
    LaunchedEffect(targetLocation, isFollowing) {
        if (isFollowing) {
            cameraPositionState.animate(
                update = CameraUpdateFactory.newLatLngZoom(targetLocation, 16f),
                durationMs = 1000
            )
            // animate() is suspending — returns when the animation completes
            // CameraUpdate types: newLatLng, newLatLngZoom, newLatLngBounds, newCameraPosition
        }
    }
}

The pattern most delivery apps use: auto-follow until the user manually pans, then stop following. The user can re-enable follow with a button. Detect manual pan via cameraPositionState.isMoving combined with cameraPositionState.cameraMoveStartedReason:

@Composable
fun MapWithFollowMode(
    viewModel: DeliveryViewModel
) {
    val cameraPositionState = rememberCameraPositionState()
    var isFollowing by remember { mutableStateOf(true) }
    val driverLocation by viewModel.driverLocation.collectAsStateWithLifecycle()

    // Detect when the user manually pans, disable follow
    LaunchedEffect(cameraPositionState) {
        snapshotFlow { cameraPositionState.cameraMoveStartedReason }
            .collect { reason ->
                if (reason == CameraMoveStartedReason.GESTURE) {
                    isFollowing = false
                    // ✅ User panned manually — stop following
                    // The user can re-enable follow with a UI button
                }
            }
    }

    // When in follow mode, animate camera to track driver
    LaunchedEffect(driverLocation, isFollowing) {
        if (isFollowing && driverLocation != null) {
            cameraPositionState.animate(
                CameraUpdateFactory.newLatLngZoom(driverLocation!!, 16f),
                durationMs = 1500
            )
        }
    }

    GoogleMap(cameraPositionState = cameraPositionState) {
        // ... markers, polylines
    }
}

The cameraMoveStartedReason values: GESTURE (user touch), API_ANIMATION (your code called animate()), and DEVELOPER_ANIMATION (non-animated programmatic camera moves). Distinguishing a gesture from your own animation is what enables “stop following only when the user pans, not when I auto-follow.”


Viewport-Driven Optimization

Here’s the optimization most Maps tutorials never cover, and it’s where production apps win on performance: only render what’s visible.

The pattern: subscribe to camera state, compute the visible bounds, query only entities inside those bounds. As the user pans, the visible set changes; markers outside come off, markers entering come on.

@Composable
fun ViewportFilteredMarkers(
    cameraPositionState: CameraPositionState,
    viewModel: DeliveryViewModel
) {
    // Observe the current visible bounds, recomputed only when camera idles
    val visibleBounds by produceState<LatLngBounds?>(initialValue = null, cameraPositionState) {
        snapshotFlow { cameraPositionState.isMoving }
            .filter { !it }  // ✅ Only emit when camera STOPS moving
            .collect {
                value = cameraPositionState.projection?.visibleRegion?.latLngBounds
            }
    }

    // Recomputed only when the bounds key changes. Reading each flow's .value
    // here is a one-off snapshot, not a subscription; derivedStateOf would add
    // nothing because no snapshot state is read inside it.
    val visibleVehicles = remember(visibleBounds) {
        val bounds = visibleBounds
        if (bounds == null) emptyList()
        else viewModel.allVehicleIds.filter { id ->
            val v = viewModel.observeVehicle(id).value
            bounds.contains(LatLng(v.lat, v.lng))
        }
    }

    visibleVehicles.forEach { id ->
        key(id) {
            PerVehicleMarker(id, viewModel, onClick = { ... })
        }
    }
}

Why filter on camera-idle, not on every camera position update: every frame of a pan is a different visible region. Re-filtering 60 times a second is wasted work. Wait until the camera stops moving (a few hundred ms after the user finishes panning), then re-filter. The user perceives no difference; the device does ~99% less work.

For 5000 vehicles in the fleet with 50 visible at any moment, this drops marker rendering work by 100x. Pair it with marker clustering for distant zoom levels (next post in this cluster) and you have a Maps app that scales.


Location Updates Without Killing Battery

The driver app side: streaming the driver’s location to the server every few seconds for the dispatcher’s map view. Same problem as the Uber post but worth recapping with implementation specifics for Maps consumers.

// FusedLocationProviderClient with battery-conscious config
private fun createLocationRequest(): LocationRequest =
    LocationRequest.Builder(
        Priority.PRIORITY_HIGH_ACCURACY,
        4_000L  // 4-second interval
    )
        .setMinUpdateIntervalMillis(2_000L)
        .setMinUpdateDistanceMeters(10f)
        // ✅ Don’t emit if vehicle hasn’t moved 10 meters
        .setWaitForAccurateLocation(false)
        .setMaxUpdateDelayMillis(8_000L)
        // ✅ Allows OS to batch updates up to 8s for power efficiency
        .build()

// Power-mode awareness: reduce update frequency when battery is low
private fun adjustForBatteryLevel(): LocationRequest {
    val batteryLevel = currentBatteryLevel()
    return when {
        batteryLevel > 30 -> createLocationRequest()
        batteryLevel > 15 -> LocationRequest.Builder(Priority.PRIORITY_BALANCED_POWER_ACCURACY, 8_000L).build()
        else -> LocationRequest.Builder(Priority.PRIORITY_LOW_POWER, 30_000L).build()
        // Below 15% battery: drop to 30-second intervals at lower accuracy
        // The dispatcher sees less precise tracking but the driver finishes their shift
    }
}

Three knobs that matter:

Priority. HIGH_ACCURACY uses GPS + Wi-Fi + cell. BALANCED_POWER_ACCURACY leans on Wi-Fi + cell with little GPS. LOW_POWER is mostly cell towers. The accuracy degradation is real (15m → 50m → 500m typical), but the battery savings are substantial.

Min update distance. If a vehicle is parked, you don’t need updates. setMinUpdateDistanceMeters(10f) means “don’t fire callbacks unless we’ve moved 10m since the last one.” This drastically reduces no-op work when stationary.

Max update delay. setMaxUpdateDelayMillis(8_000L) tells the OS “you can batch up to 8 seconds of updates and deliver them together.” The OS’s power scheduler can group your work with other apps’ work, waking the radio less often. Lower = more responsive, higher = better battery.

The combination matters more than any one knob. Naive HIGH_ACCURACY with 1-second interval and no batching drains the battery in hours. The above config sustains tracking through an 8-hour driver shift.
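One refinement worth adding to adjustForBatteryLevel is hysteresis, so the tracker doesn't flap between configs when the battery hovers around a threshold. A pure-logic sketch (the 5-point re-entry band is an arbitrary choice, not an SDK feature):

```kotlin
enum class TrackingTier { HIGH, BALANCED, LOW }

// Drop tiers at 30% and 15% (matching the thresholds above), but require the
// battery to climb 5 points past a threshold before moving back up, so a
// battery hovering at 29-31% doesn't toggle the location config every reading.
class TierSelector {
    var current = TrackingTier.HIGH
        private set

    fun update(batteryPercent: Int): TrackingTier {
        current = when (current) {
            TrackingTier.HIGH -> when {
                batteryPercent <= 15 -> TrackingTier.LOW
                batteryPercent <= 30 -> TrackingTier.BALANCED
                else -> TrackingTier.HIGH
            }
            TrackingTier.BALANCED -> when {
                batteryPercent <= 15 -> TrackingTier.LOW
                batteryPercent > 35 -> TrackingTier.HIGH
                else -> TrackingTier.BALANCED
            }
            TrackingTier.LOW ->
                if (batteryPercent > 20) TrackingTier.BALANCED else TrackingTier.LOW
        }
        return current
    }
}
```

Feed update(...) the battery level on each location callback and rebuild the LocationRequest only when the returned tier actually changes; swapping the request itself has a cost.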


Map Lifecycle — The Often-Missed Detail

Google Maps SDK has a lifecycle that doesn’t automatically tie to your composable’s lifecycle in older integration patterns. With maps-compose the integration is mostly automatic, but a few things to know:

1. The OpenGL context is expensive. Creating a map view allocates a GL context, a tile decoder pipeline, and several megabytes of cache. Destroying it tears all that down. If your user navigates between map screens frequently, the cost adds up.

2. Map state can be saved across config changes. The camera position survives rotation if you use rememberCameraPositionState with no key. Marker states with rememberMarkerState work the same way.

3. MapView for non-Compose code. If you have legacy XML screens, MapView requires manual lifecycle forwarding (onCreate, onResume, onPause, onDestroy). Skipping these leaks the GL context. Compose’s GoogleMap handles this automatically.

4. Backgrounded maps consume memory but no CPU. When your map screen goes to the background, the GL surface is paused but tile cache and marker state remain in memory. If you navigate to a heavy screen, the system may reclaim that memory. On return, the map re-renders from cache (fast) or re-fetches (if cache was reclaimed). Either way, it works — but the user briefly sees blank tiles on a memory-pressured device.
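For the legacy MapView path in point 3, the forwarding looks like this (the standard boilerplate; the layout-free setContentView is just to keep the sketch short):

```kotlin
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity
import com.google.android.gms.maps.MapView

// Every lifecycle event must be forwarded to MapView, or the GL context
// and tile pipeline leak when the screen is destroyed.
class LegacyMapActivity : AppCompatActivity() {
    private lateinit var mapView: MapView

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        mapView = MapView(this)
        setContentView(mapView)
        mapView.onCreate(savedInstanceState)
    }

    override fun onStart() { super.onStart(); mapView.onStart() }
    override fun onResume() { super.onResume(); mapView.onResume() }
    override fun onPause() { mapView.onPause(); super.onPause() }
    override fun onStop() { mapView.onStop(); super.onStop() }
    override fun onDestroy() { mapView.onDestroy(); super.onDestroy() }
    override fun onLowMemory() { super.onLowMemory(); mapView.onLowMemory() }

    override fun onSaveInstanceState(outState: Bundle) {
        super.onSaveInstanceState(outState)
        mapView.onSaveInstanceState(outState)
    }
}
```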


Custom Map Styles

The default Google Maps style is recognizable and busy. For a branded delivery app, you probably want a custom style: muted greys for backgrounds, your brand colors for highlights, hidden POIs that aren’t relevant.

Generate a style at mapstyle.withgoogle.com (Google’s official styling tool, JSON-based) or use a curated style from a third party.

@Composable
fun StyledDeliveryMap() {
    val context = LocalContext.current
    val mapStyle = remember {
        MapStyleOptions.loadRawResourceStyle(context, R.raw.map_style_delivery)
        // Reads from res/raw/map_style_delivery.json
    }

    GoogleMap(
        properties = MapProperties(mapStyleOptions = mapStyle)
    ) { ... }
}

The JSON file is a list of styling rules — feature types plus the styling to apply to each. Tint road colors toward your brand, hide transit stops that aren’t relevant to delivery, and reduce POI density so the route stays the focus.
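A small example of the rule format (the featureType and elementType names follow Google's published styling reference; the highway color is an arbitrary placeholder):

```json
[
  {
    "featureType": "poi.business",
    "stylers": [{ "visibility": "off" }]
  },
  {
    "featureType": "transit",
    "stylers": [{ "visibility": "off" }]
  },
  {
    "featureType": "road.highway",
    "elementType": "geometry",
    "stylers": [{ "color": "#c9d6e3" }]
  }
]
```

Saved as res/raw/map_style_delivery.json, this is exactly what loadRawResourceStyle in the snippet above reads.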

Performance note: extremely heavy styles (hundreds of rules) can slow tile rendering. The default styles and most custom ones from the Google tool are fine; bespoke styles with hundreds of rules need testing.


When You Outgrow Google Maps

Google Maps SDK is the right answer for most apps. It works, it’s familiar to users, the data is excellent, the SDK is mature. Reasons you might consider alternatives:

Cost. Maps API has a generous free tier ($200/month credit) but heavy users (millions of map loads, large request volumes for routing/places) pay real money. At scale, Mapbox and OSM-based solutions can be 3-10x cheaper.

Customization. Need raster overlays, custom tile sources, vector data control, or 3D rendering beyond what Google offers? Mapbox is more flexible.

Offline-first apps. Google Maps doesn’t have great offline support for arbitrary regions. Mapbox and OSMDroid both have first-class offline tile management.

Markets where Google services are restricted. China, certain other regions: Google Maps is unusable, requires Tencent Maps, Baidu Maps, or AMap.

Most apps don’t outgrow Google Maps. Most apps that think they have outgrown it can solve their actual problem with the optimization patterns in this post (viewport filtering, marker clustering, scoped state observation). The full Mapbox vs. Google vs. OSM comparison gets a dedicated post in this cluster.


Pitfalls Worth Calling Out

API key leakage. Hardcoded keys, keys in git, keys without restrictions. Restrict your key to your app’s package + signing fingerprint. Monitor usage in Google Cloud Console. Set quotas to prevent runaway billing if a key does leak.

Recomposition storms on marker updates. The list-of-vehicles pattern shown above is the trap. Per-vehicle state observation is the fix. If your map stutters during data updates, this is almost always the cause.

Marker icon allocation. Decoding bitmap descriptors inside composables without remember. Drains memory and CPU. Cache by status / type.

Polyline rebuilding on every GPS tick. The driver moved 5 meters, you reconstructed a 600-point polyline. Update polylines on meaningful distance thresholds, not every callback.

Battery drain from over-tight location intervals. 1-second HIGH_ACCURACY without batching is the “why is my driver app draining 30% per hour” pattern. Use the FusedLocationProviderClient knobs, especially setMaxUpdateDelayMillis.

Forgetting that the map needs network. Going to a new region, no cached tiles, slow connection: the map appears broken. Show appropriate loading UI; don’t pretend the map is offline-capable when it isn’t.

OEM battery optimization killing the location service. Same lessons as the OEM post in this blog: foreground service of the right type, battery optimization exemption, OEM-specific user guidance. The map UI works fine; the data feeding it stops if the location service gets killed.


Closing

Google Maps SDK is one of those APIs that’s easy to start with and surprisingly deep once you ship at production scale. The tile pipeline, the rendering performance characteristics, the state-observation patterns that prevent recomposition storms, the camera state machine, the battery-conscious location streaming — these are what separate “the demo works” from “the app survives a day in production.”

The patterns from this post apply directly to delivery, ride-sharing, fitness GPS tracking, navigation, real-estate, store-locator, and any other Map-centric app. Future posts in this Maps cluster will go deeper: marker clustering for thousands of points, the Mapbox vs. Google Maps vs. OSMDroid comparison, offline maps strategies, and custom map overlays for things like heat maps and weather data.

For your delivery tracker: scoped state per vehicle, viewport-filtered rendering, smooth marker animation via rememberMarkerState, polyline updates on meaningful thresholds, follow-mode camera with gesture-aware deactivation, FusedLocationProviderClient with battery awareness. Six patterns; covers most production needs.

Happy coding!
