Android Performance Improvements for Slideshare

Co-authors: Karthik Ramgopal and Josh Clemm

Slideshare for Android makes it easy to learn on the go. Members can explore, save, or search across millions of presentations and documents directly from their Android device.

These presentations are uniquely challenging for our app. A single presentation can have hundreds of slides, and each slide is represented as an image. This creates large data models and numerous high-resolution images. On Android, where low-end devices have paltry amounts of RAM, that combination is a real challenge and can lead to many crashes.

Therefore, to build a compelling, smooth, and crash-free experience, we needed to improve the app under the hood. We focused our optimizations on three areas: image loading, JSON parsing, and data model storage. Below, we'll explore these three areas and how our iterations have made Slideshare for Android better for our members.

Slideshare for Android

We've tried a few open source image loading libraries like Picasso and Glide in the past, but we always ran into memory issues because of the sheer magnitude of the image data we handle. Even if the app didn't crash, the UI would stutter when scrolling: large bitmaps cause garbage collection pauses, especially on Dalvik, where stop-the-world GC_FOR_ALLOC events are common.

We needed to optimize our image loading and data processing pipelines to keep RAM usage down and provide an optimal experience for our members by avoiding out-of-memory crashes.

To alleviate GC pressure and reuse bitmap memory, Android's BitmapFactory provides an inBitmap option. However, prior to KitKat it comes with its own set of caveats, requiring the reused bitmap to be exactly the same size, configuration, and image format. This limitation is too severe for Slideshare, since our images are fairly heterogeneous, and we did not want to change our server-side image pipeline just to accommodate client-side restrictions. Even when we worked within these limitations, we saw bitmaps frequently being evicted from the pool under memory pressure.

Since we’ve run into this image loading issue on Android in almost all our LinkedIn applications, and none of the open source libraries out there were cutting it, we decided to write an in-house library to solve this.

Our in-house image loader library has support for intelligent bitmap pooling and recycling using reference counted Bitmaps. It uses different strategies for bitmap pooling and recycling depending on the OS version:

  • Pre-Lollipop: Bitmaps are decoded using the inPurgeable flag, forcibly pinned to ashmem, and unpinned and recycled when the reference count drops to zero. Since bitmaps are stored in ashmem rather than the Java heap, the garbage collector never pauses for bitmap allocation or collection.

Pre-lollipop image loading flow
  • Lollipop onwards: Bitmaps are pooled, and inBitmap is used to reuse them when decoding. We could not use the previous technique here, since Google deprecated the inPurgeable flag in Lollipop. Also, GC in ART runs mostly in the background, so we can get away with keeping bitmaps on the heap without significant UI hiccups.
Lollipop and onward image loading flow
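Both strategies hinge on reference counting to decide when a bitmap can safely be recycled or returned to the pool. Here is a minimal sketch of that idea in plain Java; the generic payload and the release callback stand in for an actual Bitmap and its recycle/unpin step, and the class name is ours, not the library's.

```java
import java.util.concurrent.atomic.AtomicInteger;

// Sketch of a reference-counted resource. The creator holds the initial
// reference; each consumer calls acquire() before use and release() after.
// When the count hits zero, onRelease runs (e.g. Bitmap.recycle(), or
// returning the bitmap to a pool for inBitmap reuse).
final class RefCounted<T> {
    private final T value;
    private final Runnable onRelease;
    private final AtomicInteger refs = new AtomicInteger(1);

    RefCounted(T value, Runnable onRelease) {
        this.value = value;
        this.onRelease = onRelease;
    }

    T acquire() {
        if (refs.incrementAndGet() <= 1) {
            throw new IllegalStateException("resource already released");
        }
        return value;
    }

    void release() {
        if (refs.decrementAndGet() == 0) {
            onRelease.run(); // last reference gone: recycle / unpin / pool
        }
    }

    int refCount() {
        return refs.get();
    }
}
```

Because recycling happens deterministically at refcount zero rather than whenever the GC gets around to it, bitmap memory is reclaimed promptly and GC pauses on the UI thread are avoided.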

We use JSON as the transport format for client-server communication. Since all of our app UI is resource-driven, we deserialize JSON into data models (POJOs on Android) and serialize them back into JSON when sending them to the server. Android's built-in JSON parser is slow and memory-heavy, and it gained stream parsing support only in Honeycomb. As our payloads can be rather large and deeply nested, stream-based (SAX-style) parsing is imperative for performance.

Stream-based parsing unfortunately involves a lot of hand-written boilerplate to handle JSON tokens and edge conditions, which is prone to programmer error. To avoid this, we wanted a system that would generate this boilerplate for us. Jackson provides an ObjectMapper API which handles this via runtime annotations. However, ObjectMapper uses reflection (which does not perform well on Android), causing startup slowdowns and an initial spike in memory when the object tree is built.
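To make the boilerplate concrete, here is a sketch of the per-field dispatch a hand-written stream parser needs, for a tiny hypothetical Slide model. A real implementation would walk Jackson JsonParser tokens; the regex here merely stands in for the token stream so the sketch is self-contained.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hand-rolled "streaming" parse of a flat JSON object: visit one field at a
// time and dispatch on its name. Every model needs code like this, which is
// exactly what LoganSquare generates for us at compile time.
final class SlideParser {
    static final class Slide {
        String title;
        int index;
    }

    // Matches "key": "string" or "key": number. Illustration only; a real
    // parser consumes tokens from a JsonParser instead.
    private static final Pattern FIELD =
            Pattern.compile("\"(\\w+)\"\\s*:\\s*(?:\"([^\"]*)\"|(\\d+))");

    static Slide parse(String json) {
        Slide slide = new Slide();
        Matcher m = FIELD.matcher(json);
        while (m.find()) {                 // next field "token"
            switch (m.group(1)) {          // dispatch on field name
                case "title":
                    slide.title = m.group(2);
                    break;
                case "index":
                    slide.index = Integer.parseInt(m.group(3));
                    break;
                default:
                    break;                 // unknown field: skip, as generated parsers do
            }
        }
        return slide;
    }
}
```

Multiply this switch by every field of every nested model, plus null and type edge cases, and the appeal of generating it automatically becomes obvious.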

After investigating parsing solutions, we picked LoganSquare for two major reasons:

  • Annotations are very similar to ObjectMapper, avoiding boilerplate code, but they are processed at compile time, avoiding reflection.

  • The generated code uses the Jackson streaming API, which is extremely high performing and frugal on memory.

As an example, a model with every field annotated lets us parse the network JSON response directly into a Slideshow object.
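Here is a sketch of what such an annotated model might look like. The real @JsonObject and @JsonField annotations live in com.bluelinelabs.logansquare.annotation; they are stubbed inline here so the sketch is self-contained, and the field names are hypothetical.

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.util.List;

// Stand-ins for LoganSquare's annotations (illustration only).
@Retention(RetentionPolicy.RUNTIME) @Target(ElementType.TYPE)
@interface JsonObject {}

@Retention(RetentionPolicy.RUNTIME) @Target(ElementType.FIELD)
@interface JsonField { String name() default ""; }

// A LoganSquare-style model: plain fields, one annotation each. The
// annotation processor generates the streaming parse/serialize code.
@JsonObject
class Slideshow {
    @JsonField(name = "id") long id;
    @JsonField(name = "title") String title;
    @JsonField(name = "slide_urls") List<String> slideUrls;
}
```

With the real library on the classpath, parsing is then a one-liner, `Slideshow show = LoganSquare.parse(inputStream, Slideshow.class);`, backed by compile-time generated code rather than reflection.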

Ideally, mobile apps use a database caching layer to store and retrieve data, so it is important that we use a fast and lightweight database on Android. We started off using SQLite for our caching mechanism, but SQLite carries a transaction overhead for reading and writing data, and our data kept growing in size and complexity. A key-value store, on the other hand, eliminates this transaction overhead. We came across SnappyDB, a key-value database for Android, which outperforms SQLite in reads and writes irrespective of data size and complexity.

We first integrated SnappyDB as a prototype to store search result histories. We then replaced SharedPreferences with SnappyDB, after noticing bottlenecks once the number of stored values grew large.
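To illustrate the shape of a key-value cache like this, here is a minimal sketch. The in-memory map stands in for SnappyDB (whose real API opens a database via DBFactory and exposes similar put/get calls, but requires an Android Context); the SearchHistoryStore class and its key scheme are hypothetical, not our actual schema.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Minimal key-value store interface, shaped like SnappyDB's put/get calls.
interface KeyValueStore {
    void put(String key, String value);
    String get(String key); // null if absent
}

// In-memory stand-in; the real app would back this with SnappyDB.
final class InMemoryStore implements KeyValueStore {
    private final Map<String, String> map = new LinkedHashMap<>();
    @Override public void put(String key, String value) { map.put(key, value); }
    @Override public String get(String key) { return map.get(key); }
}

// Hypothetical search-history cache: one key per recent-query slot. Note
// there is no schema, no table, and no transaction: just keyed reads/writes.
final class SearchHistoryStore {
    private static final int MAX_SLOTS = 10;
    private final KeyValueStore store;

    SearchHistoryStore(KeyValueStore store) { this.store = store; }

    void record(int slot, String query) {
        store.put("search:" + slot, query);
    }

    List<String> recent() {
        List<String> out = new ArrayList<>();
        for (int i = 0; i < MAX_SLOTS; i++) {
            String q = store.get("search:" + i);
            if (q != null) out.add(q);
        }
        return out;
    }
}
```

The absence of SQL and per-operation transactions is precisely what makes this pattern cheaper than SQLite for simple cached reads and writes.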

We plan to continue rewriting our other tables with SnappyDB in the future. We like that SnappyDB can work with Serializable objects and arrays, not just primitive types. Plus, it stores and retrieves objects and arrays quickly because it uses Kryo to serialize them.

To quantify the extent of our improvements, we benchmarked them across many devices. Here are some results from a Nexus 5:

  • We used the Memory Monitor tool to track our app's memory usage over time. Regular memory usage was cut by 47 percent and peak memory usage by 40 percent.

  • We tracked our Newsfeed screen for GC_FOR_ALLOC pauses. These pauses occur when the app attempts to allocate memory while the heap is already full, forcing the system to stop the app and reclaim memory; the app becomes unresponsive as a result. These pauses were cut by 92 percent.

  • Finally, thanks to LoganSquare, JSON parsing time was reduced by 15 percent for our large JSON payloads.

In addition to these quantitative benchmarks, we also found that the app was now extremely fluid without any stuttering, especially when scrolling long lists of slides. And perhaps most importantly, these changes helped make our Android app crash-free for 99.9 percent of our members!

By focusing on image loading, JSON parsing, and data model storage, Slideshare for Android is better than ever. We plan to continue migrating more tables to SnappyDB, further optimizing our image loading library, and looking out for even more optimizations. Plus, we will be spreading the word to all the other LinkedIn Android apps! Download Slideshare today and let us know what you think.

The Slideshare Android team discussing app optimizations