Ray-Ban Meta Glasses Upgrade with AI Integration Sets New Tech Standard

As the tech industry rapidly develops new possibilities, Meta is introducing a major update for its Ray-Ban Meta camera glasses.

The upcoming update will give the glasses an A.I. assistant that can perceive and describe real-world surroundings, much like the virtual A.I. persona in the movie “Her.”

The Ray-Ban Meta Glasses, with frames starting at $300 and lenses starting at $17, have mostly served as tools for taking photos and video clips and listening to music. The new A.I. software, however, puts those core features to much broader use.

Ray-Ban Meta Glasses (image credit: Meta)

With the new software, the glasses can carry out a range of tasks: identifying popular landmarks, translating languages, and recognizing animal breeds and tropical fruits, among other things.

Using the A.I. software is simple: the wearer says the wake phrase “Hey, Meta,” followed by a command such as “Tell me what breed this dog is.” The A.I. answers through tiny speakers embedded in the glasses.

The innovative, unconventional nature of the A.I. software drew the interest of Brian X. Chen, a tech columnist who has reviewed the Ray-Bans before, and Mike Isaac, a Meta correspondent who has incorporated the smart glasses into his cooking show.

When they learned about the update, they were eager to try it. Meta provided them with early access, and they have been testing the technology over the past few weeks.

A Meta representative cautioned that because A.I. is still a new field, the glasses may not always be accurate, but user feedback will help improve them over time.

These round-tinted glasses offer a glimpse of the future, but they are not always accurate. When trying to identify an animal or a fruit, they sometimes get it wrong, likely because the camera’s image quality is limited.

There is also a social factor: talking to a virtual assistant in public doesn’t feel natural, and it may never come to feel normal.