You might know something like the back of your hand, but Google’s next project might know the back of your hand better than you.
Our phones can do a whole lot more than just make calls, but they soon might be able to work out whether you need to see a doctor, and fast.
While a wearable can offer a glimpse into your health by tracking vitals like heart rate, ECG, and blood oxygen levels (depending on the device you wear), Google is working on a tool that can look at images of your skin, hair, or nails, work out whether you might have an issue, and suggest whether you should see a doctor to have it checked out.
It’s one of the many things that came out of Google I/O this year, alongside the announcement of Android 12 and the Project Starline evolution of video chat.
However, Google’s dermatology tool is a little different, relying heavily on artificial intelligence to make its concept work.
Essentially, Google will use your phone’s camera to take images of the area you’re concerned about on your skin, hair, or nails from different angles, and will then ask you questions about your skin type, your symptoms, and how long you think you’ve had the issue.
Google will then analyse this information against a database of 288 conditions to determine what your issue is closest to, providing similar results from around the web. From there, you can make a call on whether your condition is serious enough to warrant a trip to the GP, or whether it’s something you can set aside for the moment.
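Google hasn’t published how the model actually works under the hood, but the flow it describes, combining photo features with questionnaire answers and scoring them against 288 conditions, looks like a fairly standard multi-class classifier. Here’s a minimal sketch of that idea; every name, the feature sizes, and the random stand-in data are purely illustrative, not Google’s implementation:

```python
# Illustrative sketch only: Google has not published its model,
# so all names, sizes, and weights here are hypothetical.
import numpy as np

# Stand-in labels for the 288-condition database mentioned by Google
CONDITIONS = [f"condition_{i}" for i in range(288)]

def closest_conditions(image_features: np.ndarray,
                       answer_features: np.ndarray,
                       weights: np.ndarray,
                       top_k: int = 3) -> list[tuple[str, float]]:
    """Combine image and questionnaire features, score every condition,
    and return the top matches with softmax confidences."""
    features = np.concatenate([image_features, answer_features])
    logits = weights @ features                 # one score per condition
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()                        # softmax over all conditions
    top = np.argsort(probs)[::-1][:top_k]
    return [(CONDITIONS[i], float(probs[i])) for i in top]

# Toy usage with random stand-in data
rng = np.random.default_rng(0)
img = rng.normal(size=128)       # embedding of photos taken from several angles
answers = rng.normal(size=16)    # encoded answers: skin type, symptoms, duration
w = rng.normal(size=(288, 144))  # hypothetical trained weights (288 x 144)
print(closest_conditions(img, answers, w))
```

In a real system the image embedding would come from a trained vision model rather than random numbers, but the shape of the problem, many possible conditions ranked by likelihood rather than a single verdict, is why the tool surfaces “closest matches” instead of a diagnosis.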
Developed over three years of machine learning research, Google’s AI-based dermatology tool is not intended to be a diagnostic tool, though. As such, you shouldn’t expect it to act like Dr. Google and hand you the answer to what ails you.
However, it could offer a helping hand, or at the very least a small nudge in the right direction.
And if anything, it’s a neat use of artificial intelligence that could pave the way for similar AI-powered analysis tools in the future.