At Home with Tech

Unlock the power of all your technology and learn how to master your photography, computers and smartphone.

Category: Tech in the News

Should You Clone your Voice to Help Preserve your Legacy?

With a little help from my recently cloned voice, I asked ChatGPT about the personal value of voice cloning. I then converted the AI response into an audio conversation between ChatGPT and my virtual self. My resulting podcast featuring the cloned me is below. 

Thanks to the rapid evolution of AI technologies, anyone can now clone their own voice and generate a reasonable duplicate through text-to-speech software.

The Benefits and Risks of Voice Cloning

If you market your voice professionally, then cloning your voice could bring certain benefits as well as inevitable risks. (Who needs to pay you for your actual voiceover work when a good AI copy will do?)

And of course, this topic also brings ethical concerns regarding unauthorized use.

But for most of us, who are hopefully not on the radar of bad actors, I wonder whether there's any value in cloning your voice. How might that help you in your journey through life… or beyond?

Preservation of your Legacy

One benefit could simply be the preservation of your own voice for legacy purposes, much like the value of an old family photo for archival use.

On the other hand, wouldn’t it be a little creepy if a family member could generate more of your voice after you’re gone?

Ask ChatGPT

So, I decided to interview ChatGPT to delve into this issue. For the purposes of this exercise, I first cloned my own voice using “Instant Voice Cloning” from Eleven Labs, the software company that offers natural-sounding speech synthesis. I then assigned an Eleven Labs virtual voice to play a fictional ChatGPT expert.
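(A quick aside for the tinkerers: beyond its web interface, Eleven Labs also exposes text-to-speech through a REST API, so you can script your cloned voice. Here’s a minimal Python sketch of that idea; the endpoint path, `xi-api-key` header and `model_id` reflect the public API docs as I understand them, and `voice_id`/`api_key` are placeholders you’d pull from your own account. Check the current documentation before relying on it.)

```python
import json
from urllib import request

API_BASE = "https://api.elevenlabs.io/v1"

def build_tts_request(voice_id: str, text: str, api_key: str,
                      model_id: str = "eleven_multilingual_v2"):
    """Assemble a text-to-speech request for the Eleven Labs REST API.

    Returns (url, headers, body) so the request can be inspected
    before anything is actually sent over the network.
    """
    url = f"{API_BASE}/text-to-speech/{voice_id}"
    headers = {"xi-api-key": api_key, "Content-Type": "application/json"}
    body = json.dumps({"text": text, "model_id": model_id}).encode("utf-8")
    return url, headers, body

def synthesize(voice_id: str, text: str, api_key: str,
               out_path: str = "cloned_voice.mp3") -> None:
    # Sends the request and writes the returned audio to disk.
    url, headers, body = build_tts_request(voice_id, text, api_key)
    req = request.Request(url, data=body, headers=headers, method="POST")
    with request.urlopen(req) as resp, open(out_path, "wb") as f:
        f.write(resp.read())
```

In practice you’d call `synthesize("your-voice-id", "Hello from my clone", "your-api-key")` and get back an audio file spoken in your cloned voice.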

Finally, we were ready for our little chat. Here’s our audio interview, which I created from the ChatGPT-generated transcript. The under-four-minute podcast features both my real voice and my cloned voice created through text-to-speech AI. (My previous podcast episode was all me.) And remember, everything about my guest was created by ChatGPT, including her name.

I think the results successfully bend reality…

Can You Find the Real Barrett?

Will Apple Vision Pro Give Us the Future We Expected?

Is Apple’s spatial computing launch leading us to this vision of tomorrow? (Thanks to Adobe Firefly’s generative AI for helping me to visualize.)

No, I’m not running out to buy an Apple Vision Pro. Not yet. Not this year. Not at this price point. Yet I couldn’t be more excited about it. Yes, of course I really crave this mixed-reality headset. And I know that eventually, I’ll be wearing one. And that makes me so happy.

Believing this likelihood helps me reaffirm the possibility that we’ll make it to the promised future one day. Sometimes it still feels so 20th century.

1980, 1999 and 2001 are foundational science fiction dates that reality couldn’t live up to. We don’t have flying cars or undersea cities yet. Electric cars aren’t quite mainstream. We’re even having a hard time getting ourselves back to the moon without crashing.

Sure, I know that remarkable technological innovations do permeate humanity every year. I sometimes just don’t feel it so much on a Monday morning.

Apple Changed my Life
You can say what you want about Apple as a marketing machine and its amazing ability to create an uncontrollable Pavlovian response for each of its new product lines. But its past shiny gear from the future did revolutionize how we computerized and accessorized.

Apple delivered big time.

Now, my Apple tech feels quite normal, and I’ve forgotten that I once existed without my Mac Studio, iPhone 15 Pro Max, Apple Watch, iPad and AirPods.

I couldn’t imagine how to live without these devices, unless I chose to be off grid and banish myself to a tech-free isolation. (Luddites may form a line on the left to debate me on this.)

I need my Apple gear.

The Era of Spatial Computing Begins
Will the Vision Pro eventually become a must-have device too? Well, that’s the question today. The entire VR/AR category has been struggling to go mainstream for years. Maybe rebranding it to ‘spatial computing’ will help. (hmmm)

The Vision Pro won’t benefit quite as much from the FOMO factor. It doesn’t seem portable enough to carry around for others to see you wearing it (even though an Apple commercial demonstrates a happy woman wearing one on a plane).

It’s probably going to be a while until I’m surrounded by an army of Apple Vision Pros on a city street the same way I once experienced hundreds of AirPods orbiting and taunting me while I walked to work.

Thousands of people moving together on the streets of New York City, wearing Apple Vision Pros and experiencing augmented reality.

Now, that’s a vision of the future.

Affordability is Relative
I know today’s Apple Vision Pro is not perfect. It’s version 1. But the reviews I’ve read all agree it’s a huge leap forward compared to past headsets.

Of course, it is. That’s what Apple usually does.

And I know Vision Pro is only going to get better, and hopefully less expensive than its current $3,499 starting cost. Apple isn’t exactly known for dropping its prices, other than creating parallel products with older tech (iPhone SE).

On the other hand, how many thousands of dollars did many of us fork out for those early plasma HDTVs?

And remember that Apple Vision Pro is also a complete standalone computer… not just a mixed-reality headset.

Still, the price point is undeniably a limiting factor. And Apple must know this.

The Future has Arrived
I couldn’t be more excited about a product that I’m not buying, and I expect that I’m not alone.

I’m sure that Tim Cook has a plan to make Apple Vision Pro the next iPhone. And something tells me that V1 is all part of a long-term plan to draft me into the Vision Pro ecosystem.

It’s just a matter of time until I’ll be wearing the future on my face.

Borg Barrett is ready to be assimilated. No resistance from me.

And I’ll be smiling.

How to Use AI to Easily Improve your iPhone Photography

AI can effortlessly and perfectly select the people in your photos to individually brighten and edit. Here’s how to access this superpower using your iPhone and Adobe Lightroom.

I don’t travel about taking family photos with my own team of lighting professionals and a heavy bag of prime lenses (I wish). I typically just use the camera baked into my smartphone, which as you know is my trusty iPhone.

Sure, I sometimes get more ambitious and bring my GoPro, my Panasonic Lumix LX-10 or Lumix ZS200 with its bigger zoom. But my iPhone 15 Pro Max has a great camera system. And best of all, it’s always with me.

So, like most of us, I generate the bulk of my photography through my phone.

When the Light is your Enemy
Even though my iPhone’s camera skills are admirable, the world usually doesn’t present perfect conditions to capture an optimal photo. Often, the lighting is not quite right.

Your subject can often look dark. Sure, my iPhone can sometimes handle this challenge. But it has problems (as does any camera) when my subject isn’t as well-lit as other parts of the frame. A similar limitation develops when the background is too bright (such as when your subject stands in front of a window with sun pouring in).

Then you’ve got yourself a silhouette shot, which is the opposite of what you probably wanted.

Yes, you can try to reframe, but that’s not always possible. Often, the only option is to snap the photo and then try to fix it in post.

The Former Limits of Photo-Editing Solutions
There are any number of photo-editing programs that let you brighten your photo to pump up how your subject appears. (Your smartphone will do this in one click.) But that can often start to overexpose the other parts of your image that are already sufficiently bright.

Professional photo editing programs can enable you to just select a portion of your photo to enhance, but there’s not been a one-click solution… until recently.

How to Brighten the People in your Photos using the AI in Adobe Lightroom
I use Adobe Lightroom Classic to organize and enhance my photos. The software now offers the ability to perfectly isolate and select people in photos with just one click. Being able to accomplish that used to take years of training and practice with complex software.

But with the power of Adobe Sensei AI, Lightroom does all that for you. Then you can easily pump up how the people in your photo look.

Here’s how:

  • In the Develop module, click on the circular Masking Tool on the top right. That’s your entry point.

At the top of your masking options, there are three boxes you can click to select:

  • Subject
  • Sky
  • Background

The AI-powered Masking Tool immediately isolates a perfect cut out and adds a mask that you can brighten, darken or adjust in any number of ways. If there are several people in your photo, and you want to enhance the look of just one, you can click on ‘People’ to select that individual.

It’s amazing.
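If you’re curious what that mask-then-brighten step amounts to under the hood, here’s a minimal Python sketch of the concept (this is not Lightroom’s actual algorithm, just the idea): an exposure gain is applied only to pixels the AI mask selected, while everything outside the mask is left alone.

```python
def brighten_masked(pixels, mask, gain=1.5):
    """Brighten only the masked (subject) pixels.

    pixels: 2D list of 0-255 luminance values
    mask:   2D list of booleans (True = pixel selected by the mask)
    gain:   exposure multiplier applied inside the mask only
    """
    out = []
    for row, mask_row in zip(pixels, mask):
        # Multiply selected pixels by the gain, clamp to 255;
        # unselected pixels pass through untouched.
        out.append([min(255, round(p * gain)) if m else p
                    for p, m in zip(row, mask_row)])
    return out

# A tiny 2x2 "image": the left column is the dark subject.
image = [[100, 200],
         [40, 250]]
subject = [[True, False],
           [True, True]]
print(brighten_masked(image, subject))  # → [[150, 200], [60, 255]]
```

Note how the already-bright 200 pixel is untouched and the 250 pixel clamps at 255; that clamping is exactly why heavy-handed masked brightening starts to look fake.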

Two Examples of Lightroom’s Masking Tool in Action
Here’s one example of using the Masking Tool to pump up the light and color saturation of just the two people in my shot overlooking Exit Glacier in Alaska’s Kenai Fjords National Park.

Brighten the People

And here’s another example where I used the Masking Tool to brighten this somewhat hidden young moose I spotted while biking near Anchorage.

Brighten the Moose

It’s not a perfect shot but being able to actually see the moose more clearly with the help of AI certainly improves it.

  • A warning: Clicking-in more light onto your subjects should be a subtle enhancement. Otherwise, it will look fake. So, sprinkle in your extra light sparingly.

Add Buttery Bokeh Blur Using your iPhone’s Portrait Mode
Once you’ve got your photo subjects properly ‘re-lit,’ then you can focus on editing the backgrounds in your shots. A relatively new AI trick you can use is adding in background blur or ‘bokeh.’

This nifty visual effect used to be achievable only while taking photos with a more traditional camera in bright light using the right lens and aperture setting. Now you don’t have to be an expert photographer to get some bokeh. AI can create the same effect in post!

An iPhone camera’s Portrait Mode setting is designed to do exactly that. The iPhone’s software in the Photos app isolates the background from your subject, allowing you to dial in your background blur. You can snap away and then later choose to add bokeh (as long as the photo was originally taken in Portrait Mode).

This feature has been a game changer for me.

  • Another warning: You might want to dial back the amount of your iPhone’s auto bokeh level setting (yes, you can do that). Sometimes, just a subtle background blur is all you need. Too much may make the background look like it’s a complete digital replacement.

Three Levels of Bokeh
Here are three examples adding different levels of bokeh in Portrait Mode on my iPhone. I took this selfie while I was shopping for a new pair of reading glasses. You don’t need to see the optometrist office background. So, I thought it was a perfect opportunity to blur it out. But how much bokeh is the right level? You decide…

Lens Blur in Lightroom Classic
Adobe Lightroom Classic can perform the same bokeh trick with its new Lens Blur feature. In one click, you can create a depth map of your photo using Adobe Sensei. From that point, you can tinker further to adjust the scope of the blur.

Is It Cheating to Use AI to Improve your Photos?
The technology to digitally adjust your photos has been around for years. But some of the tricks were complex to pull off. The big change now is AI can do much of the same work for you with just a few clicks.

Should you feel like all of this is somehow cheating? Are you not really a good photographer because you couldn’t capture your shot perfectly in-camera and needed AI to save you?

Please.

If you’re a Luddite, maybe. Otherwise, this is simple technological progress.

Time to get on board and use some AI-oomph to make your photos shine brighter!