At Home with Tech

Unlock the power of all your technology and learn how to master your photography, computers and smartphone.

Should You See “The Wizard of Oz at Sphere” When Visiting Las Vegas?

If you enjoy watching movies on an IMAX screen, or if you’ve ever been to a 4D theater and think that’s cool, then you’d better hold onto your hat when walking into Sphere in Las Vegas.

I recently visited this enormous venue for the first time while I was attending the NAB conference, and I was not disappointed. In fact, I was blown away (almost literally). Sphere was quite the immersive experience.

We’re Off to See the Wizard

I saw “The Wizard of Oz at Sphere,” reimagined for the massive 160,000-square-foot screen and 17,600-seat dome-shaped arena. The movie’s run time was slimmed down to 75 minutes, but everything else was supersized.

This new 2025 version uses cutting-edge tech to greatly expand each scene beyond the dimensions of the 1939 original, both on screen and off. However, the creative process also revealed certain technological limits when scaling the movie’s main characters up to such insane proportions.

GenAI Enhanced

In a few closeups, I noticed that Judy Garland’s face had that Princess Leia artificial CGI vibe from “Rogue One: A Star Wars Story.” Clearly, there was some GenAI power at work here. 

And I was right. I later learned while attending a Google keynote at NAB that Google DeepMind partnered with the visual effects artists to tackle many of the hurdles involved in creating ‘super resolution’ imagery as well as entirely new elements.

But what I didn’t realize while watching the movie was the multitude of ways that GenAI magic had also been deployed to flawlessly do the impossible. It was usually so seamless that I didn’t even notice it happening.

Sure, it may be a challenge to extend background elements like the haunted forest or the yellow brick road. But it’s an entirely different hurdle to visually extend the actors’ performances. And that had to happen, because the wider screen required characters to be in frames where they originally weren’t.

Performance Generation

So, to create something from nothing, they used ‘AI character outpainting,’ which morphed original medium shots and cropped bodies into full-body shots.

But that wasn’t all of it. 

Since the field of view was so much wider on the wraparound screen, more characters suddenly had to remain in particular shots, even though they were framed out in the original scenes. So, those performances needed to be ‘generated.’

And it often felt entirely natural. (Occasionally, there was resulting blurring for background characters, but I expect most viewers wouldn’t notice.)

Inside the Tornado

Each of the movie’s old 4:3 frames had to be broken down, reconstructed, digitally extended and then sometimes reimagined.

When you begin to recognize the process, effort and wizardry applied here, the results become even more impressive.

And then there were certain scene enhancements that were hardly subtle. They’re blatantly reengineered to knock your socks off with everything that Sphere can throw at you.

I’ll mention the tornado sequence and say no more, other than it was worth the price of admission right there. 

Incredible.

400 Section Seats Are Fine

Speaking of the price of tickets, I sat in the higher ‘400’ center section. Though my seat was relatively far away from the more expensive seating, it was absolutely fine for this film. 

If anything, the higher vantage point gave me a better view to appreciate all the expanded visual elements stretching over my head.

I got the same incredible 4D and audio experience as the lower section seats. Sure, I may have missed out on (spoiler alert) a falling apple or two, but that’s okay.

Impressive, Most Impressive

I bought my ticket to “The Wizard of Oz at Sphere” at the last minute. So, I didn’t have time to research how they remade this movie. That’s probably for the best, because I didn’t know exactly what to look for. I simply experienced it.

Sure, I still spotted some of the limitations of 2024–2025 AI tech, but now I also recognize and appreciate how far these filmmakers stretched to create this.

Ultimately, I am floored by the overall achievement. 

And just visiting Sphere by itself was a total experience. (It didn’t really matter what was playing.)

There’s a mischievous design cleverness that’s displayed throughout the entire space, beginning from the moment you walk into the lobby.

And after I sat down in my seat, I must admit I was fooled by the preshow illusion, an homage to Radio City Music Hall in New York City.

So, if you’re in Vegas, and you want to take in a show, I absolutely recommend you head over to Sphere.

You’ll quickly discover you’re not in Kansas anymore.

How to Turn Your Photo into a Cinematic Video Clip Using GenAI

Have you ever encountered a visual moment that you immediately felt compelled to capture with a photo? It happens to me all the time. Taking a walk on the street… hiking in the forest… or commuting to work. I just see it and say to myself, “Wow, that would be a great picture.” 

I imagine you’ve experienced this too.

One Frame is All You Need

And whether you happen to be carrying your camera or can quickly pull out your smartphone, it’s often possible to capture the moment. But the visual opportunity usually lasts for just a few seconds, right? You might get off a couple shots, but then the opportunity has passed.

If you’re also attracted to the motion of the moment and you want to shoot a little video clip, that’s usually a much harder task, because there simply isn’t enough time.

That’s where GenAI video creation tools can really help. No, they can’t turn back or slow down time, allowing you the freedom to shoot that perfect video. 

But GenAI can take your photo and then magically generate a few seconds of realistic action from your freeze frame. 

A Picture is Worth a Thousand Words

This faux reality is clearly not the same as recording the real deal.

But if you consider the potential of this amazing and disruptive AI video creation process as an art form, you can sidestep (for the moment) the ethical debate surrounding fake video use.

Today, we’re simply talking about using your very real photos as the creative foundation to generate video art. Think of it as a realistic-looking digital painting in motion.

And instead of having to start from scratch with just words and a series of complex prompts, your photo can instantly provide a ton of art direction for your scene. From there, you can focus more on creating the action in it.

Here’s an example of how I do this…

Start with One of Your Photos

Upload your photo to the GenAI platform of your choice. For today’s demonstration, I’m using Google Veo. I snapped this photo of a man flying a kite on the beach on a cool spring afternoon. 


Animate Your Photo

Then I asked Veo to bring it to life.


Change the Person in Your Photo

Next, I asked Nano Banana to change the man into a woman.


Create a Cinematic Video

Finally, I had Veo create a video from that new photo and add a tracking dolly move to the shot.

This cinematic motion added to the scene was especially impressive, because pulling it off for real would otherwise have required a video crew for the day with expensive gear.

I generated this video clip in about 30 seconds.

Impressive. Most impressive.

AI Video Creation Tip:
Use a Start and End Image

Often, it’s easier to get what you’re looking for if you first upload what the first and last frames of your new video should look like (using two of your photos).

That provides guardrails for your GenAI platform to appropriately fill in the middle. Otherwise, it might do anything it wants, creating entirely unwanted action. Even uploading the same photo as both the first and last frame can help stabilize the output.
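The value of pinning both endpoints shows up even in a toy, non-AI setting. The sketch below is plain Python and purely illustrative (a generative model like Veo does something far richer than linear blending): once the first and last frames are fixed, every in-between frame is constrained, which is exactly the guardrail idea.

```python
# Toy illustration: each "frame" is just a list of pixel brightness values.
# Fixing both endpoint frames determines everything in between -- a GenAI
# model has far more freedom than this, but fixed endpoints still leave it
# less room to invent unwanted action in the middle.

def interpolate_frames(start, end, steps):
    """Return `steps` frames blending `start` into `end`, endpoints included."""
    frames = []
    for i in range(steps):
        t = i / (steps - 1)  # 0.0 at the first frame, 1.0 at the last
        frames.append([(1 - t) * a + t * b for a, b in zip(start, end)])
    return frames

start_frame = [0.0, 0.2, 0.4]  # hypothetical 3-pixel "photo"
end_frame = [1.0, 0.2, 0.0]    # hypothetical second photo
clip = interpolate_frames(start_frame, end_frame, steps=5)

# The first and last frames match the uploaded "photos" exactly.
print(clip[0])   # [0.0, 0.2, 0.4]
print(clip[-1])  # [1.0, 0.2, 0.0]
```

This also hints at why uploading the same photo as both endpoints calms things down: when start and end are identical, every interpolated frame stays close to that photo.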

Perfect results are not guaranteed, but I’ve found it captivating to explore this process. Here are a few more examples based on my own photos.

Woman Sits on Church Steps in Barcelona

GenAI video


Man and his Dog Walk to Waterfall

GenAI video


This is Serious

So yes, this is really cool. But what’s my practical application for all this beyond showing off a parlor trick of sorts?

Well, clearly, this is all about the future of video creation. And GenAI video will only continue to become more realistic. Again, beyond the ethical questions surrounding this topic, there are massive implications that are already rewriting the rulebook for the entire video production industry.

And as video production sits at the center of my career, I’ve been paying close attention to this revolution.

It’s easy to play in this sandbox in the name of art or personal photography. But there’s a much larger arena that’s already being impacted.

Soon, playtime will be over.

It’s time to skill up. 

Why I Couldn’t Stop Watching the Final Hours of NASA’s Artemis II Mission

I spent Friday night glued to my TV watching NASA’s feed of the reentry and splashdown of Artemis II’s Orion capsule. I tuned in a half hour before Integrity reached the edge of Earth’s atmosphere and those six terrifying blackout minutes. Then, the multiple parachute deployments and splashdown! I stuck with the feed all the way until the two helicopters delivered the four astronauts to the flight deck of the retrieval ship.

My several-hour experience felt something like streaming a movie. As I got up from my couch and turned off my TV, I reflected on my Friday night flick.

On the one hand, it might be one of the more boring movies I’ve ever seen. There were long stretches where nothing really happened, and then the audio feed went entirely silent.

On the other, it was perhaps the most stunning and gripping event I’ve ever watched.

Because this was no movie. This was reality.

Houston, Will We Have a Problem?

This was real drama. A space capsule hurtling into the atmosphere at almost 25,000 mph. Will the heat shield hold? Will all the parachutes unfold? Sure, we heard optimistic audio commentary from mission control. But anyone could read the room and understand the clear risks.

Forget special effects… Did you see that real-time video shot from Integrity’s window as the scorching plasma ring began to envelop the ship? Whoa!

And that spotter plane’s unbelievable broadcast feed that followed Integrity plummeting downward from 100,000 feet with no net before the parachute phase began. Such a crazy, cool shot.

If I had walked into our family room at that moment, I would have certainly asked what sci-fi movie was playing and commented on how realistic the special effects looked.

Better than Any Movie

That’s so ironic, because our visual understanding of space travel for the past half century has been primarily informed by Hollywood. So, when I’m exposed to the real thing, it’s entirely jarring.

And let me tell you, reality can still run rings around Hollywood.

Sure, not every shot was as ‘cinematic.’ Some of the imagery was shaky and blurry. It’s raw. It’s real. And then other shots locked in perfectly and blew you away.

That chopper shot that showed the first helicopter returning the astronauts… It was a silky-smooth tracking shot over the water at golden hour… better than any movie. 

Then, there was the Navy ship’s robotic camera coverage that documented the helicopters touching down onto the ship’s deck… Yes, it was a little jerky, but also stunning.

Is This a Rerun?

You may feel like you’ve seen this all before, but you haven’t… not in real time. It’s one thing to watch a History Channel documentary on the space race. It’s entirely another experience not to know how the Artemis II mission ends.

And if you’ve only been peripherally paying attention to NASA’s admittedly less-than-compelling program in the years since the Space Shuttle, you might be blurring facts with fiction.

Don’t we already have a moonbase? No, that’s from “2001: A Space Odyssey.”

How about Mars? No. Except for some robot rovers and an unlikely little helicopter, all the imagery you might recall is from Hollywood.

Even if you’ve got your facts straight, you may not feel the reality. So, spending a little time watching reality offers an important reminder that space travel is difficult and dangerous… and the real deal is always remarkable.

Powerful Storytelling

Nobody has experienced a manned Moon mission since Apollo 17 in 1972. And the last act is always a doozy.

This little capsule-turned-fireball racing through the atmosphere felt very “Project Hail Mary” to me.

I know NASA knows what it’s doing, and we’ve returned from the Moon before, but NASA has clearly evolved its storytelling game by simply sharing more of the visceral experience with viewers.

I think that has a lot to do with better camera tech. (More powerful lenses and advanced transmission technologies.) But it also demonstrates NASA’s commitment to better share its own story.

Class Dismissed… For Now

So, I watched each key moment as our Artemis II astronauts traveled from outer space to their big splashdown in the Pacific. And then I watched the Navy go to work. So impressive.

I cannot think of a better way to spend a Friday night. Sure, anyone can watch the 30-second recap the next morning. But it’s not the same. I got the full experience… the complete lesson on how it’s done.
(And I highly recommend it.)

I am inspired. I feel like a kid again.

Thank you, NASA. And congratulations.