On Tuesday, October 9th, Google held its annual hardware event. And while most of its new hardware was impressive (I'm looking at you, notch on the Pixel 3 XL), what really stole the show was Google's focus on driving digital transformation through software.
With the launch of the first Pixel phone, it was clear that Google has embraced Steve Jobs’ mantra at Apple, “Couple the software with the hardware for the best experience.” Google had historically been a software company, building a product deployed on millions of machines they had no hand in creating. Google’s Pixel changed this.
Now, Google regularly updates a line of devices that showcase best-in-breed practices of how their software should work. New hardware is created to support new features, and new features drive the push of new hardware.
The software took center stage at the Google Pixel 3 event, just as it did during Google IO earlier in the year. In attendance and on the internet, audiences were mesmerized by the seamless interaction between person and technology.
Google's use of voice-driven AI, "only what you need" UX/UI flows, and combined interactions doesn't just define what's best for Android; it also provides a roadmap for how these types of natural interactions can apply to other industries, like medical software, construction, or internal B2B enterprise tools.
At Google IO 2018, demonstrators had Duplex, their AI-driven voice interface, cold-call a hair salon and book an appointment. Audiences were amazed at the fluidity of the interaction, but even more astounding was that the salon employee had no idea the "person" booking the appointment was a machine.
Imagine how useful a similar voice-responsive technology would be if deployed in an enterprise software environment, like medical or healthcare. Nurses and patients could get up-to-date information about specific procedures or treatments without needing to search through hundreds of data rows.
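To make the idea concrete, here is a minimal, hypothetical sketch of that kind of voice lookup: a transcribed query is matched against a small in-memory store of procedure notes. The `PROCEDURES` data and the word-overlap matching are invented for illustration; a real system would sit behind a speech-to-text service and a proper knowledge base.

```python
# Hypothetical sketch: match a transcribed voice query against a small
# store of procedure notes, instead of making staff search data rows.
# Data and matching logic are illustrative assumptions, not a real API.

PROCEDURES = {
    "wound dressing": "Change sterile dressing every 24 hours; document in chart.",
    "iv insertion": "Verify order, prep site with antiseptic, use smallest suitable gauge.",
    "medication check": "Confirm patient ID and dosage against the chart before administering.",
}

def answer_query(transcript: str) -> str:
    """Return the procedure note whose name best overlaps the spoken query."""
    words = set(transcript.lower().split())
    best_name, best_score = None, 0
    for name in PROCEDURES:
        score = len(words & set(name.split()))
        if score > best_score:
            best_name, best_score = name, score
    return PROCEDURES[best_name] if best_name else "No matching procedure found."

print(answer_query("what are the steps for iv insertion"))
```

Even this toy version shows the UX win: the user asks in natural language and gets one answer, not a search results page.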
Training and education take on a whole new level of interaction, too, whether that's updating a construction foreman on current policies while on the job, or helping students learn the details of their future careers.
Masters of UX and UI, Google has created a viable design system that drives how people work seamlessly with AI. The more intelligent a system is at giving you only what you need, when you need it, the less cluttered it becomes. Digital transformation efforts should take heed of what Google is doing here.
At 6:45 am, Google Hub shows you what you need to start the day: weather, to help you dress correctly; traffic, so you can plan your schedule; and news that may be relevant as you're getting ready or commuting.
Hub's Google Assistant isn't making you swipe, launch, or hunt for relevant items. It's making an educated guess based on your patterns of behavior and environment, then serving them up. Need something different? Just ask, and it's there. Google will learn that, too.
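The "educated guess" pattern can be sketched in a few lines. This is not how Assistant actually works internally; the card names, engagement counts, and the time-of-day cutoff below are all assumptions for the example. The point is simply that ranking by observed behavior, filtered by context, replaces a wall of options with a short list.

```python
# Illustrative sketch of "serve only what you need": pick briefing cards
# by time of day and by how often the user engaged with each card before.
# Card names and usage counts are invented for the example.
from datetime import time

USAGE_COUNTS = {"weather": 42, "traffic": 37, "news": 18, "calendar": 9, "recipes": 3}
MORNING_CARDS = {"weather", "traffic", "news", "calendar"}

def briefing(now: time, top_n: int = 3) -> list:
    """Return the top-N cards for this time slot, ranked by past engagement."""
    candidates = MORNING_CARDS if now.hour < 10 else set(USAGE_COUNTS)
    ranked = sorted(candidates, key=lambda c: USAGE_COUNTS.get(c, 0), reverse=True)
    return ranked[:top_n]

print(briefing(time(6, 45)))  # → ['weather', 'traffic', 'news']
```

A real product would learn the counts and the context filters instead of hard-coding them, but the interface principle is the same: fewer, better choices.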
Overwhelming interfaces are one of the biggest complaints about older enterprise software. Often designed for Windows 95 or earlier, little thought went into how the person sitting at the desk wants to engage with the content.
Innovating in software means understanding not only the problems users have, but how they solve those problems. What is their desk environment like? What about the lighting? What do they need to see when they are actively engaged in a task? How can their next needs be predicted and made immediately apparent?
Sure, this makes software nicer to look at, but it also makes it better to use. That means a boost in productivity and stakeholder happiness, because no one has to remember, "Always save the document twice, otherwise the backup copy gets lost, due to a bug in the system." That one's a true story from one of our clients.
Google created a pretty amazing feature with its AI known as Google Lens – essentially a way for AI to read a photo, and parse the data.
At the Pixel event, Google pushed Lens integration even deeper, pulling the address off a menu or surfacing information about a product you may want to buy.
Working these technologies into eCommerce and similar products makes obvious sense. But what if they were applied elsewhere? Image recognition feels like magic, but it really amounts to a robust database of information paired with the ability to spot similarities and anomalies.
Roll this type of technology into a safety application for a manufacturing enterprise, and you create the ability to spot workplace problems automatically. That means efficiency, safety, and big cost savings from avoided liability and mistakes.
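The "similarities and anomalies" idea reduces to comparing feature vectors against a reference database. Here is a hedged sketch under that assumption: the toy vectors stand in for the features a real vision model would produce, and a photo is flagged when it doesn't resemble any known-safe scene closely enough.

```python
# Sketch of anomaly spotting: compare a new image's feature vector against
# a database of known-safe scenes; flag anything too dissimilar.
# Vectors and the 0.8 threshold are toy stand-ins, not a real model's output.
import math

SAFE_SCENES = {  # invented feature vectors for known-safe workstation photos
    "station_a": [0.9, 0.1, 0.2],
    "station_b": [0.8, 0.2, 0.1],
}

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def is_anomaly(features, threshold=0.8):
    """True if the photo doesn't resemble any known-safe scene."""
    best = max(cosine(features, ref) for ref in SAFE_SCENES.values())
    return best < threshold

print(is_anomaly([0.85, 0.15, 0.15]))  # close to safe scenes → False
print(is_anomaly([0.05, 0.9, 0.9]))    # very different → True
```

In production the vectors would come from a trained network and the threshold from validation data, but the flag-on-dissimilarity logic is the core of the safety use case.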
At first blush, this doesn't seem like an advanced concept that enterprises would benefit from, but let's unpack it.
Bracketing photos means capturing not a single millisecond in time, but the moments around that millisecond. This accounts for the unpredictability of day-to-day life: if someone moves, or a more natural, more enjoyable moment happens just before or after you press the shutter, those frames are still captured. Roll this concept into file saving or even interface actions, and you start to see the brilliance of this type of system.
Many cloud-drive systems, like Box or Dropbox, offer versioning at the file level. Software like Photoshop gives users access to an extensive history of the actions they've taken. And everyone knows what CMD- or CTRL-Z does, and how valuable it is. But all of this has limits.
Software that factors in bracketing on an item-by-item scale opens the door to enormous flexibility in editing and data recovery, particularly in a world moving toward "shared documents," as Google's own G Suite tools have done by letting multiple people edit at once. Apply these concepts so that a single form field, deleted image, or headline is accessible in its own history, and you remove errors and save time.
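A minimal sketch of field-level "bracketing" might look like this: every field keeps its own version list, so one overwritten headline can be recovered without rolling back the whole document. The `FieldHistory` class and its methods are invented for illustration, not any real product's API.

```python
# Sketch of item-level history for a shared document: each field keeps its
# own versions, so a single bad edit can be reverted without touching the
# rest of the file. Class and method names are hypothetical.

class FieldHistory:
    def __init__(self):
        self._history = {}  # field name -> list of versions, oldest first

    def set(self, field, value):
        """Record a new value as the latest version of the field."""
        self._history.setdefault(field, []).append(value)

    def current(self, field):
        return self._history[field][-1]

    def revert(self, field):
        """Drop the latest version and restore the previous one."""
        versions = self._history[field]
        if len(versions) > 1:
            versions.pop()
        return versions[-1]

doc = FieldHistory()
doc.set("headline", "Pixel 3 Event Recap")
doc.set("headline", "Googel Event")  # accidental bad edit by a collaborator
print(doc.revert("headline"))        # → Pixel 3 Event Recap
```

Real collaborative editors layer conflict resolution on top of this, but per-item history is the piece that turns "undo everything" into "undo just that field."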
Steve Jobs once coined the phrase, “It just…works!” At the time, he was touting Apple’s near-magical ability to make intuitive software that gave users what they wanted, when they wanted, and got out of their way. In comparison to the crowded menu-bars of Microsoft or other popular software at the time, he was right.
Today, we don't need to rely solely on great interfaces to make this happen. Deploying machine learning and AI inside software, predicting users' next steps and actions, makes everything better.