iPad AI teacher demo points to an amazing new world for students

If you haven’t yet watched yesterday’s OpenAI event, I highly recommend doing so. The headline news was that the latest model, GPT-4o, works seamlessly with any combination of text, audio and video.

This includes the ability to “show” the GPT-4o app what’s on your screen in another app, a capability the company demonstrated with an impressive iPad AI teacher demo…

GPT-4o

OpenAI said that the “o” stands for “omni.”

GPT-4o (“o” for “omni”) is a step toward more natural human-computer interaction – it accepts as input any combination of text, audio, and image and generates any combination of text, audio, and image as output.

It can respond to voice input in as little as 232 milliseconds, with an average of 320 milliseconds, which is similar to human response time in conversation. […] GPT-4o is especially good at vision and audio understanding compared to existing models.

Even the audio side of this alone is a big deal. Previously, ChatGPT could accept voice input, but it converted that input to text before working with it. GPT-4o, in contrast, genuinely understands speech, so it skips the conversion stage altogether.
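To make the distinction concrete, here is a rough sketch of what that older transcribe-first pipeline looks like when built with OpenAI’s Python SDK. It is only an illustration of the general approach, not OpenAI’s actual ChatGPT implementation, and the file name, prompt and model choices are placeholders.

```python
# Sketch of the *old* voice pipeline: speech is transcribed to text first,
# so the language model never hears tone, emotion or background audio.
# Illustrative only; the file name and prompt are placeholders.
from openai import OpenAI

client = OpenAI()

# Step 1: speech-to-text with a separate transcription model
with open("question.m4a", "rb") as audio_file:
    transcript = client.audio.transcriptions.create(
        model="whisper-1",
        file=audio_file,
    )

# Step 2: the language model works purely on the transcript
reply = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": transcript.text}],
)
print(reply.choices[0].message.content)

# Step 3 (not shown): a separate text-to-speech pass reads the reply aloud.
# GPT-4o's native audio mode collapses these stages into a single model call.
```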

As we noted yesterday, free users also get a lot of features that were previously limited to paying subscribers.

iPad AI teacher demo

One of the capabilities OpenAI demonstrated is GPT-4o’s ability to see what you’re doing on the iPad screen (in split-screen mode).

The example shows the AI tutoring a student working through a math problem. You can hear that GPT-4o initially understood the problem and wanted to solve it right away. But the new model can be interrupted, and here it’s asked to instead help the student solve the problem for themselves.
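For anyone wondering what the screen-sharing side amounts to under the hood, here is a minimal sketch of handing a captured screenshot to GPT-4o using the image-input format of OpenAI’s Chat Completions API. This is an approximation only: the demo app streams the screen live, and the file name and prompt below are placeholders.

```python
# Minimal sketch: send a screen capture to GPT-4o as an image, together with
# a tutoring-style instruction. Approximates the demo; not its actual code.
import base64

from openai import OpenAI

client = OpenAI()

# Encode a saved screenshot as a base64 data URL (placeholder file name)
with open("ipad_screenshot.png", "rb") as f:
    screen_b64 = base64.b64encode(f.read()).decode("utf-8")

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {
            "role": "user",
            "content": [
                {
                    "type": "text",
                    "text": "Help me work through the math problem on screen, "
                            "but don't give away the answer.",
                },
                {
                    "type": "image_url",
                    "image_url": {"url": f"data:image/png;base64,{screen_b64}"},
                },
            ],
        }
    ],
)
print(response.choices[0].message.content)
```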


Another capability that emerges here is that the model claims to detect emotion in speech, and it can also express emotion itself. For my taste, this was a little overdone in the demo, and that’s reflected here: the AI is perhaps slightly on the condescending side. But all of this is adjustable.

Effectively, every student in the world can have a private tutor with this kind of ability.

How much of this will Apple integrate?

We know that AI is the primary focus of iOS 18, and that a deal is being finalized to bring OpenAI features to Apple devices. While that deal was at the time described as covering ChatGPT, it now seems very likely that it is actually for access to GPT-4o.

But we also know that Apple has been working on its own AI models, with its own data centers running its own chips. For example, Apple has been developing its own way to let Siri understand app screens.

So we don’t know exactly what GPT-4o capabilities the company will bring to its devices, but this feature seems so perfect for Apple that I have to believe it will be included. This is really using technology to empower people.

Image: OpenAI. Benjamin Mayo contributed to this report.