Google Lens will help you detect rashes, skin conditions and more; here's how it works
Google Lens can now analyse rashes on the skin to give users a reference point when they are "not sure how to describe something else" on the body. Google says Lens will help you describe the condition based on the results you get after scanning the rash or mole.
A new Google Lens feature makes it easier to identify certain medical issues. The phone's camera can be used to capture skin rashes or irritations, which Lens analyses to suggest possible skin conditions. The feature works much like Lens's existing image recognition: users can also choose an image from their photo gallery to use the skin condition detection functionality.
The new skin condition recognition feature lets users identify moles or rashes on their skin without having to provide any textual description. You can start using it by taking a picture in the Google Lens app or choosing one from the device's gallery.
After taking or selecting an image, users can swipe up to view a horizontal row of search results showing the names of possible skin conditions. Below the results, a scrollable section of visually similar photos is offered for additional reference.
Although the skin condition detection tool is built on Google's image recognition technology, it is not meant to replace a dermatologist's professional medical diagnosis; it can only offer a preliminary indication of what a skin condition might be.
Google says it introduced the feature because an odd mole or rash on the skin can be hard to describe in words alone, and a picture can sometimes explain more.
Beyond the latest health-related improvements, Google's blog post notes that the Lens app offers a variety of other features. These include helping with arithmetic homework, matching products while shopping, finding similar dishes at nearby restaurants, and translating menus, signs, and posters into more than 100 languages.