Sora is shaking up video production. It's now remarkably easy to generate production-quality clips with AI using just a few prompts. I was curious to see whether Sora lived up to the hype by creating an AI-powered video that looked realistic.
As a journalist and writer currently undergoing IVF, I had the idea of giving a talk or presentation on the fertility industry, particularly cryobanking. A story on this subject deserves to be as realistic as possible. But maybe Sora could help me get started, and possibly create some of the aerial and street shots as B-roll to complement my commentary.
I can be camera shy, so I wanted to see if Sora could create a video that wouldn't require me to worry about lighting, lines and accidental laughter.
When fertility meets machine learning, something gets lost in the algorithm. Sora couldn’t even spell the word uterus.
Here's my hands-on experience with Sora. I'll also share what I learned about identifying AI-generated fake videos.
(Disclosure: Ziff Davis, CNET’s parent company, filed a lawsuit in April against OpenAI, alleging that it violated Ziff Davis’ copyrights in the training and operation of its AI systems.)
My (AI) directorial debut
OpenAI released Sora 2 in September 2025 as an expansion of its flagship 2024 model. Sora is now free to use, and Sora 2 no longer requires an invitation or a code.
I used the original Sora and went straight to the chatbox to describe my video.
First prompt: “I’m following IVF as a journalist/writer. I want to produce an explainer video, which will be part of a series of videos, ultimately leading to an exposé on the fertility industry. For this first explainer video, create a montage of IVF clips, display news headlines, and create custom graphics.”
The resulting video was odd. The “embryologist” literally planted his face in the dish, a dead giveaway that it was AI-generated. The embryos in those dishes are worth tens of thousands of dollars.
There was also nonsense text scattered around the clips, like “Brakfoots of tecmofolitgy” and “Breaknctve tennology.”
Hair down in the lab? Absurd words? Definitely AI-generated.
The second video I produced was better, but it still looked machine-generated. It’s all about context. I’ve never seen a fertility doctor with their hair down or with a stethoscope around their neck, and AI just doesn’t understand that. All the clips looked like generic medical footage.
I realized I needed to do this one clip at a time, so I made a list of visual ideas. For example: “Show an embryo developing in the dish and reaching the blastocyst stage” (a critical moment in IVF).
It looked like a mass of cells, but it didn’t look 100% like an embryo.
It almost looks like an embryo.
I then edited my prompt, asking it to enlarge the bubbles and remove the light in the embryo.
While that one was rendering, I added a few more videos to the queue.
“Create a video of female reproductive anatomy to use in an explanation.” Like the embryo above, it didn’t look scientifically accurate (a clue that the video was fake), and it kept making odd spelling mistakes, even with the word “uterus.”
I had to go back and remix clips, directing Sora to do things like “make it look more clinical.” It was frustrating at times, especially because of the bad spelling.
Where are the ovaries?
At this point, Sora seems incapable of rendering accurate text and certainly doesn’t know what female anatomy looks like.
Nonsense language and extra parts added to the reproductive system: definitely AI-generated.
I stepped away from science to see if Sora would do better at producing a cute baby video. I asked for “a close-up of a newborn, with golden light and purity.”
And finally, we got there.
This AI newborn looked realistic.
I continued this theme by asking for clips of a newborn’s adorable feet.
I asked Sora to remove the baby’s face and zoom in on the little feet, but that didn’t happen. It also added to or subtracted from the usual number of toes in several clips of the AI baby’s feet (these tools have difficulty with fingers and toes).
These babies have too many or too few toes.
Then I asked for a pregnant woman holding her belly. This one worked, and it even placed her hands correctly.
Sora’s generated pregnancy looked decent, if a little lumpy.
Then I asked for a table with all the IVF medications and needles spread out on it. The small vials might be passable, but the black liquid in the needles? Not so much.
At first glance, these AI-generated IVF drugs seemed accurate. Then I noticed the black liquid in the needles.
For reference, this is what my IVF medications looked like.
My current IVF medications and needles.
To keep testing its abilities, I prompted Sora to create a video of news headlines about IVF and fertility. It struggles with specificity like this.
I don’t even know what language it was using.
What is this AI language?
For one last clip, I asked for a baby growing in a woman’s belly, but it just showed the belly and added more (disproportionate) hands. I thought we were past the problem of extra limbs.
Extra fingers and hands featured in these AI-generated Sora videos.
I didn’t have much luck with Sora, so I opened ChatGPT for some quick ideas for my video, and it just got weirder.
ChatGPT recommended asking Sora:
- “An egg floating in a galaxy, symbolizing creation and possibility.”
- “A glass flower garden depicting embryos growing in a laboratory.”
- “A digital twin of the human reproductive system composed of data streams and light.”
- “Futuristic fertility lab run by AI robots, minimal sterile white environment, cinematic lighting.”
- “Butterfly emerging from a test tube, symbolizing hope and transformation.”
You can’t even wear perfume or use scented shampoo on egg retrieval and embryo transfer days, so the idea of a glass flower garden in an IVF lab is ridiculous. And most of the suggestions were a little too metaphorical to be useful.
It was time to call it a wrap. I downloaded the few passable videos.
The four least-bad videos generated by Sora.
The verdict on Sora’s mistakes
While I didn’t create a cinematic masterpiece with Sora, I did get some decent clips that I can use in my first explainer-style video. That video will be a mix of me speaking to camera, footage from our own IVF journey, archival video and these AI clips.
Would I use Sora again? Sure, but I’ll wait for the new version to see if it’s better. This first attempt largely missed the mark. Using AI to visualize my IVF story only exposed the technology’s serious blind spots.
And for a story as delicate as my fertility journey, there’s no room for errors or inconsistencies.