You Can Now Make Google Messages Blur NSFW Images

If you receive nude photos via Google Messages, your Android device will soon detect and blur them with a sensitive content warning before you can view them. The feature, which is being rolled out to all users, also allows you to block numbers that send NSFW images and alerts you before you can send or forward them.

When sensitive content warnings are turned on, users will be directed to a resource page (tap Learn why nude images can be harmful) and see an option to block the sender’s number. You can choose whether to open an image you receive (tap Next > Yes, view or No, don’t view), and you can unblur or re-blur it at any time by tapping the Preview icon.

How to enable Google’s sensitive content warnings

Sensitive content warnings are opt-in for adults, so you’ll need to enable the feature in your Google Messages settings under Protection & Safety > Manage sensitive content warnings. Flip the toggle next to Warnings in Google Messages.

For Android users under 18, sensitive content warnings are on by default. Those with supervised accounts cannot control or turn the feature off by themselves, though parents can do so in the Family Link app. Unsupervised teens ages 13–17 can disable the feature in their Google Account settings.

According to 9to5Google, this feature is currently available only on some Android devices running the Messages beta, so it may not appear in your settings yet. Sensitive content warnings do not currently work on videos.

All detection happens on-device, powered by Android’s SafetyCore, meaning no identifiable data or classified content is sent to Google’s servers.

You Should Try Instagram’s New ‘Blend’ Feature for a Custom Reels Feed

Instagram has a new feature that curates custom Reels feeds for you and your friends. Blend, an invite-only option within individual or group chats, refreshes daily and suggests content based on participants’ tastes.

Spotify has a similar Blend feature that creates shared playlists—also updated daily—based on both listeners’ tastes. Note that Instagram Blend, which is a mobile-only feature, is not yet available to all users, even if the icon appears in your chats.

How to use Instagram Blend

To start a blend, you’ll need to invite users via your individual or group chats. From Messages on the mobile app, open the one-on-one or group chat you want to create a blend with and tap the new Blend icon. Everyone in the chat will receive an invite—if at least one person accepts, a blend will be created, but an individual’s suggested reels will be added only if they accept.

Once a blend is created, you can view it by tapping the Blend icon at the top of the chat, where you can then comment on or react to it. You can delete a reel that has been suggested for you by tapping the three horizontal dots and selecting Remove from your blends, though this will remove it from any blends you are part of. If you want to leave a blend, open it and tap the Settings icon > Leave this blend.

According to Instagram’s explainer page, you can also curate suggestions by indicating whether you are interested or not interested (tap the three horizontal dots on the reel) to get more or less of similar content. Sensitive content will be filtered based on the member with the strictest settings.

People Are Reverse Location Searching Photos on ChatGPT, and It Actually Works

This week, OpenAI announced its latest models: o3 and o4-mini. These are reasoning models, which break down a prompt into multiple parts that are then addressed one at a time. The goal is for the bot to “think” through a request more deeply than other models might, and arrive at a more accurate result.

While there are many possible functions for OpenAI’s “most powerful” reasoning model, one use that has blown up a bit on social media is geoguessing—the act of identifying a location by analyzing only what you can see in an image. As TechCrunch reported, users on X are posting about their experiences asking o3 to pinpoint locations from random photos, and sharing glowing results. The bot will guess where in the world it thinks the photo was taken, and break down its reasons for thinking so. For example, it might say it zeroed in on a certain color of license plate that denotes a particular country, or that it noticed a particular language or writing style on a sign.

According to some of these users, ChatGPT isn’t using any metadata hidden in the images to help it identify the locations: Some testers are stripping that data out of the photos before sharing them with the model, so, theoretically, it’s working off of reasoning and web search alone. 
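If you want to do the same before sharing a photo with a chatbot, stripping metadata is straightforward. Here is a minimal Python sketch using the Pillow library (the file names are placeholders); it simply re-saves the pixel data without the EXIF block, which is where GPS coordinates live:

from PIL import Image  # pip install pillow

def strip_metadata(src: str, dst: str) -> None:
    # Copy only the pixel data into a new image, dropping EXIF (camera info, GPS tags, etc.)
    with Image.open(src) as img:
        clean = Image.new(img.mode, img.size)
        clean.putdata(list(img.getdata()))
        clean.save(dst)

strip_metadata("vacation_photo.jpg", "vacation_photo_clean.jpg")  # placeholder file names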

On the one hand, this is a fun task to put ChatGPT through. Geoguessing is all the rage online, so making the practice more accessible could be a good thing. On the other, there are clear privacy and security implications here: Someone with access to ChatGPT’s o3 model could use the reasoning model to identify where someone lives or is staying based on an otherwise anonymous image of theirs. 

I decided to test out o3’s geoguessing capabilities with some stills from Google Street View, to see whether the internet hype was warranted. The good news is that, from my own experience, this is far from a perfect tool. In fact, it doesn’t seem to be much better at the task than OpenAI’s non-reasoning models, like 4o.

Testing o3’s geoguessing skills

o3 can handle clear landmarks with relative ease: I first tested a view from a highway in Minnesota, facing the Minneapolis skyline. It took the bot only a minute and six seconds to identify the city, and it correctly worked out that we were looking down I-35W. It also instantly identified the Panthéon in Paris, noting that the screenshot was from the time it was under renovation in 2015. (I didn’t know that when I submitted it!)

o3 correctly guessing locations

Credit: Lifehacker

Next, I wanted to try non-famous landmarks and locations. I found a random street corner in Springfield, Illinois, featuring the city’s Central Baptist Church—a red brick building with a steeple. This is when things started to get interesting: o3 cropped the image into multiple parts, looking for identifying characteristics in each. Since this is a reasoning model, you can see what it’s looking for in certain crops, too. As in my other tests of reasoning models, it’s weird to see the bot “thinking” with human-like interjections (e.g., “Hmm,” “but wait,” and “I remember”). It’s also interesting to see how it picks out specific details, like noting the architectural style of a section of a building, or where in the world a certain park bench is most commonly seen. Depending on where the bot is in its thinking process, it may start to search the web for more information, and you can click those links to investigate what it’s referencing yourself.

Despite all this reasoning, this location stumped the bot, and it wasn’t able to complete the analysis. After three minutes and 47 seconds, the bot seemed like it was getting close to figuring it out, saying: “The location at 400 E Jackson Street in Springfield, IL could be near the Cathedral Church of St. Paul. My crop didn’t capture the whole board, so I need to adjust the coordinates and test the bounding box. Alternatively, the architecture might help identify it—a red brick Greek Revival with a white steeple, combined with a high-rise that could be ‘Embassy Plaza.’ The term ‘Redeemer’ could relate to ‘Redeemer Lutheran Church.’ I’ll search my memory for more details about landmarks near this address.”

o3 having trouble identifying a location

Credit: Lifehacker

The bot correctly identified the street and, more impressively, the city itself. I was also impressed by its analysis of the church. While it was struggling to identify the specific church, it was able to analyze its style, which could have put it on the right path. However, the analysis quickly fell apart. The next “thought” was about how the location might be in Springfield, Missouri or Kansas City. This was the first time I saw anything about Missouri, which made me wonder whether the bot was confusing the two Springfields. From here, the bot lost the plot, wondering if the church was in Omaha, or whether it was the Topeka Governor’s Mansion (which doesn’t really look anything like the church).

It kept thinking for another couple of minutes, speculating about other locations the block could be in, before pausing the analysis altogether. This tracked with a subsequent experience I had testing a random town in Kansas: After three minutes of thinking, the bot thought my image was from Fulton, Illinois—though, to its credit, it was pretty sure the picture was from somewhere in the Midwest. I asked it to try again, and it thought for a while, again guessing wildly different cities in various states, before pausing the analysis for good.

Now is not the time for fear

The thing is, GPT-4o seems to be about even with o3 when it comes to location recognition. It was able to instantly identify the Minneapolis skyline, and it immediately guessed that the Kansas photo was actually from Iowa. (It was incorrect, of course, but it was quick about it.) That seems to align with others’ experiences with the models: TechCrunch was able to get o3 to identify one location that 4o couldn’t, but otherwise the two models were evenly matched.

While there are certainly some privacy and security concerns with AI in general, I don’t think o3 in particular needs to be singled out as a specific threat. It can be used to correctly guess where an image was taken, sure, but it can also easily get it wrong—or crash out entirely. Seeing as 4o is capable of a similar level of accuracy, I’d say there’s as much concern today as there was over the past year or so. It’s not great, but it’s also not dire. I’d save the panic for an AI model that gets it right almost every time, especially when the image is obscure.

In regards to the privacy and security concerns, OpenAI shared the following with TechCrunch: “OpenAI o3 and o4-mini bring visual reasoning to ChatGPT, making it more helpful in areas like accessibility, research, or identifying locations in emergency response. We’ve worked to train our models to refuse requests for private or sensitive information, added safeguards intended to prohibit the model from identifying private individuals in images, and actively monitor for and take action against abuse of our usage policies on privacy.”

How to Make Peanut Butter in the Vitamix Ascent X5

We may earn a commission from links on this page.

I typically hesitate to suggest making peanut butter in blenders, but the Vitamix Ascent X5 is the exception. It’s the best blender I’ve ever tried, and you should definitely make nut butters with it. Not only is it the absolute fastest nut processor I’ve ever tested (under a minute in some cases), but the resulting peanut butter is incredibly smooth. Here’s how to make any nut butter in the Vitamix Ascent X5.

How to use the nut butter function on the Vitamix

1. Prepare the nuts, if needed

I like to roast the nuts in the oven first, because I love the sweet, toasty flavor. To do it, spread out your nut of choice (I used walnut pieces today) on a baking sheet in a single layer, and put the sheet pan in a 350°F oven for five to 12 minutes. This time will vary depending on the nut, the size of the pieces, and thickness of the sheet pan, so use your nose as your guide. As soon as you start to smell toasted nuts, give them a stir with a wooden spatula and assess if you need a few more minutes. They should be golden with no burnt areas. Allow the nuts to cool on the sheet pan for at least 10 minutes. If you prefer raw nut butters, though, skip this step and start at step two.

2. Add the nuts to the Vitamix container

Hand unplugging stopper from blender lid.

Credit: Allie Chanthorn Reinmann

Pour the nuts into the container and snap the lid on, but keep the middle plug lined up with the notches (see picture) so it’s easy to un-stopper. You’ll need to use the tamper to assist the nuts (the tamper is that black plastic baseball bat-looking thing that came with the Vitamix Ascent X5), smashing them down toward the blade during the one-minute blend time. It’s best to have it right next to the blender—the preset timer goes by before you know it.

3. Select the nut butter preset 

Hand selecting a setting on the Vitamix blender.

Credit: Allie Chanthorn Reinmann

Press the button on the left above the knob that has three horizontal lines. The screen will display several cute little cartoons of food. Rotate the knob until you see a jar with a peanut inside. Press the play button directly to the right of the knob.

4. Tamper for your life

A plastic tamper pressing nut bits in a blender.

Credit: Allie Chanthorn Reinmann

Once the engine kicks on, the initial kick-up of nut particles will pass after a second or two. Then you can unstopper the center plug and use the plastic tamper to jam the nuts down into the blades. The engine will kick up another speed level and you’ll hear it, but don’t let it stop your tampering. If you’re ever in doubt about when to start pressing, you can look at the screen. There’s a little tamper icon with a downward arrow signaling that tampering is necessary.

The tamper icon on the blender control display.
The tamper icon appears to the left of the timer.
Credit: Allie Chanthorn Reinmann

For a chunky butter, stop the machine after 30 or 40 seconds by pressing the same start/stop button to the right of the dial. Unplug the machine and stir the mixture to see if it’s the consistency you like. If you like a smoother nut butter, let it run for the full minute. To get an even silkier nut butter, you may want to add another 15 seconds by pressing the “+:15” button, which is on the right side above the rotating dial. But generally, once the initial minute is up, you have nut butter. That’s all it takes.

Walnut butter with rum and raisins in a blender container.

Credit: Allie Chanthorn Reinmann

If you want to get a little creative, you can add a few fun ingredients. I decided to make Rum Raisin Walnut Butter today, so after the first nut butter blend finished, I added some dark rum, raisins, a touch of honey, and a bit of salt. I selected the nut butter preset again, did some tamping, and stopped the blend after about 20 seconds to incorporate the new ingredients and smooth out my walnut butter. Of course, you could keep yours low key with just a pinch of salt and some cinnamon, for example.  

Once I emptied my walnut butter into a glass container, I added warm water and a drop of soap to the blender container. (Don’t fill it more than half full; the blender will swish the water around aggressively.) Press the little button in the center with the image of bubbles and let the blender clean itself. This setting is mostly meant to get the blades clean, so don’t get frustrated if some peanut butter smears are still hanging out on the sides of the container. Just bring it over to the sink and carefully wash the sides.

Walnut butter on an English muffin.

Credit: Allie Chanthorn Reinmann

If you’ve never made walnut butter before, I highly suggest it. This spread turned out buttery-smooth with a salty and bitter edge, and it makes an excellent partner for jam. You can also increase the honey or raisins if you want it to be sweeter. 

Rum raisin walnut butter recipe

Ingredients:

  • 350 grams (3 cups) walnut pieces

  • 35 grams (1/4 cup) raisins

  • 1 tablespoon dark rum

  • 1 teaspoon honey

  • ¼ teaspoon salt

1. Roast the walnut pieces in a 350°F oven for five to 12 minutes, or until lightly browned and fragrant. Allow to cool to room temperature for at least 10 minutes.

2. Pour the walnut pieces into the Vitamix Ascent X5, add the lid, and select the nut butter setting. 

3. Press start, remove the central stopper on the lid and use the tamper to press the nuts down into the blades.

4. With the machine off, add the raisins, rum, honey, and salt. Put the lid back on and start the nut butter setting again. Tamper the mixture as it blends for another 20 seconds. Stop the machine. Scoop the nut butter into a container with a tight-fitting lid. Enjoy your new nutty spread on toast, with cake, with yogurt, or over granola. 


Why the Treadmill Can Feel So Much Easier Than Running Outside

Is the treadmill harder or easier than running outdoors? Survey runners, and you’ll get plenty of different opinions on which feels harder or easier, but the basic physics of running are the same on both. (I promise.) So why do people who are used to treadmills find that they’re slower when they run outdoors? I’m going to run through the factors that are at play here, and talk about how to adapt if you want to be able to enjoy both. 

If you find treadmill running harder, you probably already know the reason: it’s boring. You have nothing to distract you from your own effort and the glowing numbers telling you how little progress you’ve made. This is a problem that we can train our brains to solve for us over time, whether with distractions, mindfulness, or simply being grateful that we’re not outdoors in the bad weather. 

For those who find treadmill running easier, the biggest reasons have to do with the environment (heat, hills, etc.) and with your mindset (especially your ability to pace yourself). Training both indoors and outdoors will help you to make the transition a little easier. Let’s dig into the reasons. 

The treadmill is not literally easier

Before we get into the relevant factors, I want to dispel a few myths. Physics-wise, running on a treadmill is pretty much identical to running outdoors in the same conditions.

The treadmill does not move your feet

The first myth we need to bust is the idea that the treadmill “moves your feet” and thus makes running easier. That’s not true. You have to spend just as much effort to stay in place on a treadmill going, say, 6.0 miles per hour as you do to move forward at 6.0 miles per hour on flat, steady ground. Running is the action of pushing off the ground to move yourself forward. In both scenarios, you are asking your muscles to push off with a force that will keep you moving 6 mph faster than the ground.

(If the “treadmill moves your feet” theory were correct, would we not have to consider the rotating Earth its own sort of treadmill? And thus it would be 2,000 times harder to run west than to run east? Come on.)

You only need to add a 1% incline if you are running very fast

Then there’s the issue of wind resistance. Some runners will say you need to set the treadmill’s incline to 0.5% or 1% to mimic outdoor air resistance. Even on a calm day, your body has to push into the air to keep moving. Adding a small incline to the treadmill is supposed to mimic that extra effort.

But that is only true if you run at a pace of 7:30 per mile (8.0 mph) or faster. Below that, “the difference is so small as to be meaningless,” a scientist who studied the question told Runner’s World. So if you’re jogging at 6 mph, you don’t have to worry about accounting for wind resistance. 
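If you mostly think in one unit and not the other, the conversion is simple arithmetic. Here’s a quick Python sketch (the paces shown are just the examples used in this article):

def pace_to_mph(minutes: float, seconds: float = 0) -> float:
    # 60 minutes per hour divided by minutes per mile gives miles per hour
    return 60.0 / (minutes + seconds / 60.0)

print(pace_to_mph(7, 30))  # 8.0 mph, the pace where a ~1% incline starts to matter
print(pace_to_mph(10))     # 6.0 mph, the easy jog from the example above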

Now that we understand the physics, let’s talk about why treadmill runs often feel easier than outdoor runs. 

Pacing

This is probably the biggest factor (aside from weather) in why outdoor running feels harder for a person who is used to treadmill running. On the treadmill, you decide on a pace—say 6 mph, as in our example above—and then your body knows what to do. 

But outdoors, you just have to run, and then figure out later what pace you’re going. Even if you have a watch that tells you your pace, it can take anywhere from a few seconds to a couple of minutes to work out what that number is telling you. (You also may not be used to reading a minutes-per-mile pace if you’re used to seeing mph on the treadmill, which makes it even harder to know how fast you’re going.)

So you, the treadmill runner, head off on an outdoor run without a good sense of how fast you’re going. Perhaps you end up going a little too fast, but you don’t realize it until it’s too late and you’re pooped. 

Meanwhile, outdoor runners will develop a sense of pace out of necessity. You have to listen to your body, not just look at a number, to know how hard you’re pushing. 

The good news is that it’s easy for treadmill runners to learn a sense of pace—all you have to do is run outside from time to time. You’ll learn what your body feels like when you’re going at an easy pace versus a harder pace versus an unsustainable one. It just takes a little practice.

Heat, humidity, and other weather conditions

If all your treadmill runs are inside of a 68-degree gym, they’ll all feel pretty similar. But the great outdoors is fond of blessing us with heat, wind, humidity, rain, snow, ice, and similar complications. 

Heat slows us down a lot, especially if we aren’t used to it. (You do build up some heat adaptation throughout the summer.) Humidity, in combination with heat, makes this even worse. Your body can’t cool itself as well through sweating, so you get hot and stay hot. It’s normal for your pace on a hot day to be anywhere from a few seconds to a few minutes slower each mile. 

Heavy winds can also slow you down (when you’re running into the wind) or speed you up (when you’re running with the wind at your back). Ice can make you slow down to watch your footing. Snow can make you work harder as your feet sink into the ground and you have to fight to pull them back up again. 

On the other hand, a cool, dry day is better for your performance than the conditions inside a sweaty gym. Perfect running weather is (in my opinion) around 50 degrees, calm, and overcast. Most people will run a lot faster and feel better in those conditions than on a treadmill at room temperature.

Hills

If you live in a pancake-flat part of the country, you can skip this section. But many of us live where there are hills. Big ones, little ones, maybe some mountains. On the treadmill, you get to choose what incline to run with. Outdoors, your choices may be limited. 

I live in a hilly place, so even my “flat” outdoor routes aren’t entirely flat. A regulation running track is my only truly flat option. Even gently rolling hills can add up over the course of a long run, making you work harder on the uphills without ever fully giving you that speed back on the downhills. 

Different surfaces

A treadmill only has one surface. Every step meets flat ground. Every step is the same softness or hardness. Outdoors, there’s so much more variation. 

Even on a simple city run, you’ll find yourself traversing curbs, slightly tilted sidewalk slabs, cambered edges of roads, pebbles, stray garbage, and occasional patches of grass or dirt. Take your run to the trails and you’ll also hit packed dirt, mud, soft grass, leaf litter, rocks, sticks, logs, little streams you have to hop over, ruts carved by mountain bike wheels—you get the idea. Your feet have to land and push off just a little differently for each of these. 

The variety in outdoor running is good for you, but it can be fatiguing for the small muscles of your feet and lower legs if you’re not used to it.

How to train for an outdoor race if you prefer to run on the treadmill

It’s OK to do plenty of your training on a treadmill, and in some situations it may be necessary. The treadmill can let you get your training in when the weather is bad, when you can’t line up child care at your running times, or any of a number of other reasons. 

The important thing is to still run outdoors at least sometimes. If you’re training for a marathon, try to do your long runs outdoors, even if some of your shorter runs and speedwork have to be on the treadmill. Get outside when you can. That way you’re adapting to the weather, training your feet on different surfaces, and building the muscles and mindset necessary to tackle hills.

Five Ways to Keep Your Neighbors From Looking Down Into Your Yard

We may earn a commission from links on this page.

We like to think that our home is also our fortress of privacy: Once you walk through your front door, you’re free from prying eyes (or lenses) and can relax. There are a ton of ways to ensure your privacy inside, starting with items like window films, shades, and curtains. And if you want to protect your outdoor space from neighbors peering over the fence, you have several easy options.

Most of those options, however, assume your neighbors are on the same level as you. If the neighboring homes are located above yours, you’ll need to re-think your privacy plan. This is especially true if your neighbors have outdoor areas that offer a perfect view of your backyard. If you get the sense you’re being watched every time you step outside your house, here are a few ways you can regain some privacy.

Plant trees

The simplest long-term option is to plant trees—specifically canopy trees, whose branches stretch out and form a living roof over the area below. You’ll want to consult with an arborist or other landscaping professional to identify canopy trees that will thrive in your region and climate. You’ll also want to get some advice on spacing out the trees so you eventually get the lush, thick “roof” effect that will offer the privacy you’re looking for (not to mention cool down the space and block out the sun’s damaging rays).

Note the word “eventually,” however: Trees don’t just pop up in a matter of days. Maple trees, for example, are excellent canopy trees, but can take several decades to reach their full height and size. But if you’re going to be in the house (and using the yard) for a long time to come, investing in canopy trees will solve your problem and beautify the space.

Install shade sails

A shade sail is typically a large triangular or rectangular sheet of fabric that you attach to posts or existing structures in your yard (such as a fence). They let air and rain through, but block out the sun, creating shade wherever you want it. That means they can block sightlines from above, too: one or more shade sails installed in your yard will keep someone from peering down into it.

Shade sails are relatively affordable and easily installed. You can buy a shade sail kit that comes with everything you need, but it’s easy to DIY by sinking a few posts or attaching some hooks to your fence or exterior walls. They can also be quickly removed if your circumstances change.

Add a covered patio

If you have a defined patio or deck space where you spend most of your time being observed by your neighbors (or imagining you are), you can cover it pretty easily; common options include awnings, canopies, and pergolas.

Build a pergola by the fence line

Speaking of pergolas, if your neighbor’s house looms over your yard, you can enhance your privacy by building a pergola near the stretch of fence line closest to their house. The angle will work in your favor; even a modest pergola can block much of your neighbor’s view if they’re not looking straight down into your yard. And if you have multiple neighbors up there, you can add a string of pergolas along the fence line to block the angle everywhere.

You’ll have to check local regulations (or your homeowners association) before building. Some areas will require a pergola to be a set distance from a boundary fence (e.g., 16 feet), while others might have a calculation based on the size of the property and the height of your fence.

Invest in tall fencing

Finally, privacy fencing isn’t your best option for elevated neighbors, but if it is your only option for some reason, go big and install an extremely tall privacy fence. You’ll need to check local regulations to see if there are legal limits to how tall your fencing can be and then build or buy a fence at the maximum height (you can buy pre-made 12-foot and even 16-foot tall privacy fencing, though you’ll likely pay a steep price). As with pergolas near the perimeter, a tall enough fence can block your neighbors’ viewing angle. Another option is to grow a “living fence” using hedges or other plants (such as nigra arborvitae, which can reach 30 feet in height). You’ll have to wait for your plants to mature, but this can be a prettier option than a looming fence.

My Favorite Amazon Deal of the Day: The Samsung Galaxy Watch 7

We may earn a commission from links on this page. Deal pricing and availability subject to change after time of publication.

Apple users have Apple Watches, but it’s not so simple for Android users. Depending on whether you have a Pixel or a Samsung phone, there’s a smartwatch that will work more seamlessly with your device. If you have a Samsung, there’s no doubt that a Samsung Galaxy Watch is the way to go. For most people, that watch is the Samsung Galaxy Watch 7 (and yes, it works with any Android phone). Right now, you can get it for $289.99 (originally $379.99), the lowest price it has been, according to price-tracking tools.

The Samsung Galaxy Watch 7 came out in 2024 to an “excellent” review from PCMag for its incredibly accurate heart rate measurement, detailed sleep tracking, AI health insights, and overall smooth user experience.

You get a Super AMOLED screen in a 44mm or 40mm size, with the latter running $259.99 (originally $349.99) right now. This LTE version means you’re not dependent on Wi-Fi to use the internet, and you can receive calls or listen to music without relying on your phone.

The main downside you’ll see with this smartwatch (and most Galaxy watches) is the short battery life. It lasts about 22 hours, depending on use, according to PCMag’s tests (it can last up to 28 hours if you don’t use the GPS). This means you’ll likely need to charge it overnight or during the day, making the sleep tracking hard to use. However, the watch does charge pretty quickly, reaching a full charge in 88 minutes.

If you care about fitness and sleep tracking, this Wear OS watch is hard to beat, especially at its current price. You get accurate metrics from its heart rate monitor, blood oxygen monitor, pedometer, barometer, temperature sensor, gyroscope, and more. However, if you can’t get past the short battery life, consider the OnePlus Watch 2 for $245.34.