Mobile apps are now more complex than desktop apps

It is fair to say that mobile apps are now more complex than desktop apps, don’t you think? You might ask: what about Photoshop, Excel, and the other office applications you use? It’s true that many features have been built into those applications over the past three decades. Assuming that mobile devices, in forms we recognize today, are still around a few decades from now, they could accumulate as many features too. In the meantime, mobile apps have introduced new kinds of complexity to the app ecosystem. For example:

User interaction and response patterns

Desktop apps have been adding features to an ever-growing dashboard of buttons, links, and menus. That’s why we keep adding larger or additional screens to our workstations. Mobile devices introduced touch screens small enough to fit in our palms. They also enabled a larger set of user interaction and response patterns that go beyond pointing and clicking. Touch screen gestures include tap, double tap, drag, flick, pinch, spread, press, rotate, adjust, and bundle. More complex gestures can be composed from these simple ones. The mouse-and-pointer way of interacting with a computer is a stick-and-stone technology compared to what touchscreens can do.
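To make that distinction concrete, here is a minimal sketch of how a single-finger stroke of touch samples might be classified as a tap, drag, or flick. The `TouchSample` type, the thresholds, and the `classify` function are assumptions for illustration only; real mobile SDKs deliver richer event objects (e.g. `MotionEvent` on Android) and ship their own gesture detectors.

```kotlin
import kotlin.math.hypot

// One touch sample: position in pixels plus a timestamp in milliseconds.
// (Hypothetical type for illustration; not a platform API.)
data class TouchSample(val x: Float, val y: Float, val timeMs: Long)

enum class Gesture { TAP, DRAG, FLICK }

// Classify a stroke from its first and last samples:
// little movement -> tap; otherwise fast strokes are flicks, slow ones drags.
fun classify(
    samples: List<TouchSample>,
    moveThresholdPx: Float = 10f,
    flickVelocityPxPerMs: Float = 1.0f
): Gesture {
    val first = samples.first()
    val last = samples.last()
    val distance = hypot(last.x - first.x, last.y - first.y)
    if (distance < moveThresholdPx) return Gesture.TAP
    val durationMs = (last.timeMs - first.timeMs).coerceAtLeast(1)
    return if (distance / durationMs >= flickVelocityPxPerMs) Gesture.FLICK else Gesture.DRAG
}
```

A double tap would be recognized one level up, by timing the gap between two TAP results, which is exactly how simple gestures compose into more complex ones.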

When developing a mobile app, we dedicate a significant portion of the time to designing user interfaces and user experiences. Specialized developers and designers in this field are known as UI and UX specialists. Testing different use-case scenarios on mobile devices is also more complex than on desktops, because there is a larger set of combined workflows and interactions that need to be tested and refined.

Hardware that listens, feels, and watches

The idea of machines capable of listening, feeling, and watching us is creepy and fascinating at the same time. Mobile apps can interact with device sensors such as the camera, microphone, accelerometer, vibration motor, compass, GPS, flashlight, and – I kid you not – a barometer! Mobile devices know their location and position. They can see, listen, and feel like no other computer we have used before. Metaphorically, they are more aware of their surroundings than their desktop ancestors. This wider array of data inputs lets mobile apps behave in more sophisticated ways when users engage with them.
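As a small illustration of what sensor input enables, the sketch below derives a device’s tilt (pitch and roll) from one raw accelerometer reading, using the standard gravity-vector trigonometry. The axis convention and the `tiltFromAccelerometer` helper are assumptions for this sketch; on a real device the three values would arrive through the platform’s sensor API rather than as plain arguments.

```kotlin
import kotlin.math.atan2
import kotlin.math.sqrt

// Tilt angles in degrees, derived from gravity as measured by the accelerometer.
data class Tilt(val pitchDeg: Double, val rollDeg: Double)

// Assumes the common convention: x across the screen, y up the screen,
// z out of the screen, with gravity ~9.81 m/s^2 along the vertical.
fun tiltFromAccelerometer(ax: Double, ay: Double, az: Double): Tilt {
    val pitch = Math.toDegrees(atan2(-ax, sqrt(ay * ay + az * az)))
    val roll = Math.toDegrees(atan2(ay, az))
    return Tilt(pitch, roll)
}
```

A device lying flat on a table reports roughly (0, 0, 9.81) and therefore zero pitch and roll; held upright in portrait, gravity shifts onto the y axis and the roll approaches 90 degrees.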

Integration with anything as a service (XaaS)

Making API calls to access storage, data, content, and AI services in the cloud isn’t a new concept, but it took a while to become commonplace in desktop apps, mainly because cloud infrastructure and services weren’t as abundant as they are now. Mobile apps rely heavily on web services for storage, data processing, and information access. They can even consult AI and deep learning services to give us recommendations. It isn’t unusual for a mobile app to interact with multiple web services simultaneously in the back end. Such real-time access to data, information, and insights makes a mobile app feel like an intelligent entity to its users.
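That simultaneous fan-out to several backends can be sketched with a thread pool: issue all requests in parallel, then merge the responses. The service names and the stubbed fetch functions below are hypothetical stand-ins; a real app would make HTTP calls (and handle timeouts and failures) where the lambdas are.

```kotlin
import java.util.concurrent.Callable
import java.util.concurrent.Executors

// Fan out to several backend services in parallel and collect their responses.
// Each map entry pairs a service name with a function that fetches its result.
fun fetchAll(services: Map<String, () -> String>): Map<String, String> {
    val pool = Executors.newFixedThreadPool(services.size.coerceAtLeast(1))
    try {
        // Submit every request first so they run concurrently...
        val futures = services.mapValues { (_, call) -> pool.submit(Callable { call() }) }
        // ...then block until each response arrives.
        return futures.mapValues { (_, future) -> future.get() }
    } finally {
        pool.shutdown()
    }
}
```

On Android the same pattern is usually written with coroutines (`async`/`await`), but the shape is identical: launch all calls, await all results, render once everything is in.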

Next time you launch an app on your iPhone or Android device, you may have a little more appreciation for the complexities hiding behind that little app icon.