A new cybersecurity study has revealed a surprising method that could let hackers steal data from computers that aren't even connected to the internet. The attack uses something as common as a smartwatch — a device worn on the wrist — to receive secret messages sent through sound waves that humans can't hear.
For years, organizations have relied on “air-gapped” computers to keep critical information safe. These systems are physically separated from networks like the internet, making it much harder for outsiders to break in. But researchers from Ben-Gurion University in Israel have shown that isolation alone might not be enough anymore.
The method, called SmartAttack, takes advantage of ultrasonic sound — high-frequency signals beyond the range of human hearing. By using a speaker on the target computer and a microphone on a nearby smartwatch, data can be sent across short distances without anyone noticing.
How Smartwatches Come Into Play
Smartwatches are often allowed into sensitive areas, even when phones are restricted. Since they’re worn on the wrist, people carry them everywhere — including places where security is tight. These devices have microphones, processors, and wireless connections like Wi-Fi and Bluetooth, making them powerful enough to do more than just count steps or show notifications.

In the SmartAttack model, a smartwatch isn’t just a gadget. It becomes a silent receiver for data sent from a compromised computer. The computer, already infected with hidden malware, transmits information through high-pitched sound using its speaker. This sound is inaudible to people in the room but can be picked up by the smartwatch microphone.
A Step-by-Step Breakdown of the Attack
Here’s how this sneaky system might work in the real world:

Infection Stage: The attacker first manages to install malware on the target computer. This could happen through a USB device, a software vulnerability, or even by someone unknowingly bringing in infected hardware.
Compromised Watch: A smartwatch is also infected — either by being physically tampered with or by tricking the user into installing a malicious app.
Data Transmission: The malware inside the computer collects sensitive data like passwords, typing activity, or encryption keys. Then it encodes this data into ultrasonic signals and sends them through the computer’s speaker.
Silent Reception: The smartwatch listens for these signals. Once it receives the hidden data, it can send it out via Wi-Fi or Bluetooth, or store it until it's near a less secure network.
Leaving No Trace: Because no internet connection is used during the actual data transfer, traditional security systems — which focus on network monitoring — would likely miss the whole thing.
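The data-transmission step above can be sketched in code. The snippet below is a minimal illustration of the general technique, not the researchers' actual tooling: it encodes each bit as one of two near-ultrasonic tones (binary frequency-shift keying). The 18.5 kHz and 19.5 kHz carriers, the 48 kHz sample rate, and the 20 ms bit length are all assumptions chosen for illustration.

```python
import numpy as np

SAMPLE_RATE = 48_000   # Hz; a rate ordinary speakers and mics support (assumed)
BIT_DURATION = 0.02    # 20 ms per bit, i.e. roughly 50 bits per second
FREQ_ZERO = 18_500     # tone for a 0 bit (assumed; above most adults' hearing)
FREQ_ONE = 19_500      # tone for a 1 bit (assumed)

def encode_ultrasonic(data: bytes) -> np.ndarray:
    """Encode bytes as a B-FSK waveform: one inaudible tone per bit."""
    samples_per_bit = int(SAMPLE_RATE * BIT_DURATION)
    t = np.arange(samples_per_bit) / SAMPLE_RATE
    chunks = []
    for byte in data:
        for i in range(7, -1, -1):  # most significant bit first
            bit = (byte >> i) & 1
            freq = FREQ_ONE if bit else FREQ_ZERO
            chunks.append(np.sin(2 * np.pi * freq * t))
    return np.concatenate(chunks)

waveform = encode_ultrasonic(b"pw")  # 2 bytes = 16 bits of audio
print(len(waveform))                 # 16 bits * 960 samples per bit = 15360
```

Playing this waveform through a speaker would be silent to a human listener, while a nearby microphone sampling the same band could recover the bits.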
Why It Works — And When It Doesn’t
This attack isn’t magic; it relies on the unique positioning and behavior of smartwatches. Because watches move with the hand, the signal’s strength can vary depending on the wearer’s position. The attacker must be close enough — generally within six meters — and the speaker must be capable of generating ultrasonic sound.

The study tested different angles, distances, and types of speakers. Active desktop speakers worked best, but even standard laptop speakers managed to transmit at low data speeds. The best signal quality was observed when the watch faced the computer directly without any physical obstacles, like the user’s body.
Typing on a keyboard during transmission didn’t interfere much. That’s because keyboard noise happens in a different frequency range than the ultrasonic channel. In tests, the signals remained clear even during active computer use.
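Why that separation holds can be shown with a toy receiver. The sketch below uses the Goertzel algorithm to measure signal energy at a single frequency; the carrier and noise frequencies are illustrative, not taken from the paper. A frame containing a 19 kHz carrier plus loud low-frequency "keyboard-like" noise still shows essentially all of its near-ultrasonic energy right at the carrier.

```python
import math

SAMPLE_RATE = 48_000  # Hz (assumed)

def goertzel_power(samples, target_freq):
    """Relative signal power at one frequency (Goertzel algorithm)."""
    k = round(len(samples) * target_freq / SAMPLE_RATE)
    omega = 2 * math.pi * k / len(samples)
    coeff = 2 * math.cos(omega)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2**2 + s_prev**2 - coeff * s_prev * s_prev2

# A 20 ms frame: a 19 kHz carrier plus low-frequency "keyboard" noise at 2 kHz.
n = 960
frame = [
    math.sin(2 * math.pi * 19_000 * i / SAMPLE_RATE)
    + 0.5 * math.sin(2 * math.pi * 2_000 * i / SAMPLE_RATE)
    for i in range(n)
]

# Strong energy at the 19 kHz carrier, almost none at a nearby unused 18 kHz bin:
detected = goertzel_power(frame, 19_000) > 100 * goertzel_power(frame, 18_000)
print(detected)  # True
```

Because the receiver only measures energy in narrow ultrasonic bins, broadband audible noise such as typing barely registers.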
How Fast Can Data Be Stolen?
The transmission speed isn’t fast — it peaks around 50 bits per second. That means large files aren’t practical targets. But short, valuable pieces of information like login credentials, encryption keys, or typed words can be stolen in seconds or minutes.

And since the process doesn’t involve internet access, firewalls, antivirus programs, and network monitors won’t see the theft happening.
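Those limits are easy to quantify. In the back-of-the-envelope calculation below, the 50 bps peak rate comes from the study, while the payload sizes are illustrative:

```python
BITS_PER_SECOND = 50  # peak rate reported in the study

# Hypothetical payloads, sized in bits (bytes * 8):
payloads = {
    "12-character password": 12 * 8,
    "AES-256 key": 32 * 8,
    "one typed page (~2,000 chars)": 2_000 * 8,
    "1 MB file": 1_000_000 * 8,
}

times = {name: bits / BITS_PER_SECOND for name, bits in payloads.items()}
for name, seconds in times.items():
    print(f"{name}: {seconds:,.2f} s")
```

A password leaks in about two seconds and a 256-bit key in about five, while a single megabyte would take roughly 44 hours of uninterrupted transmission, which is why only small, high-value secrets are realistic targets.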
Can This Be Stopped?
Several ideas have been proposed to defend against such attacks, but each comes with trade-offs.

Banning Wearables: The simplest fix would be to block smartwatches from secure locations. But enforcing such a rule in practice is difficult and might affect employee productivity.
Sound Jamming: Security teams could deploy ultrasonic “white noise” to block hidden signals. However, this can interfere with other systems — like voice assistants or medical devices — and might not always work.
Software Firewalls for Sound: Just like network firewalls monitor internet traffic, “acoustic firewalls” could block suspicious sound activity. These are still experimental and may not be reliable yet.
Physical Modifications: The most extreme option is to remove or disable speakers and microphones from secure computers. This eliminates the threat completely but may also limit the device’s functionality.
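Of these, the acoustic-firewall idea is the easiest to prototype: inspect outgoing audio buffers and flag any with unusual energy above the audible range. The sketch below is a simplified proof of concept; the 17 kHz cutoff and 10% energy threshold are assumptions for illustration, not values from the study.

```python
import numpy as np

SAMPLE_RATE = 48_000         # Hz (assumed)
ULTRASONIC_CUTOFF = 17_000   # Hz; assumed boundary for "suspicious" energy
RATIO_THRESHOLD = 0.10       # flag if over 10% of energy sits above the cutoff

def is_suspicious(buffer: np.ndarray) -> bool:
    """Flag audio buffers carrying unusual near-ultrasonic energy."""
    spectrum = np.abs(np.fft.rfft(buffer)) ** 2
    freqs = np.fft.rfftfreq(len(buffer), d=1 / SAMPLE_RATE)
    total = spectrum.sum()
    if total == 0:
        return False
    high = spectrum[freqs >= ULTRASONIC_CUTOFF].sum()
    return bool(high / total > RATIO_THRESHOLD)

t = np.arange(4800) / SAMPLE_RATE                       # a 100 ms buffer
ordinary = np.sin(2 * np.pi * 440 * t)                  # audible tone only
covert = ordinary + np.sin(2 * np.pi * 18_500 * t)      # plus a hidden carrier

print(is_suspicious(ordinary), is_suspicious(covert))   # False True
```

A real deployment would hook this check into the operating system's audio output path; as the article notes, such "acoustic firewalls" remain experimental.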
Who Should Worry?
This isn’t a threat to everyday computer users — at least not yet. The process is too complicated, too slow, and requires physical proximity. But in government, military, or industrial environments where secrets are priceless, even slow data leaks matter.

Places that manage nuclear systems, financial algorithms, or classified projects are the real targets. These environments often rely on air-gapped systems, assuming they’re immune to remote attacks. This research shows that’s no longer a safe assumption.
What Makes This Unique
Until now, most attention in ultrasonic hacking focused on smartphones. Smartwatches, however, are worn constantly, stay close to computers, and often go unnoticed by security systems. Their small size and growing technical capabilities make them the perfect covert spy tool — if placed in the wrong hands.

This research doesn’t suggest smartwatches are evil. But it does show how even the most ordinary tech can become a security risk when combined with clever thinking and overlooked vulnerabilities.
Final Thoughts
As technology grows more connected, isolation isn’t the fortress it once was. What used to be secure simply by being unplugged can now be reached by invisible signals and ordinary devices. The smartwatch on someone’s wrist might not just be telling time — it could be quietly listening.
Image: Harbaksh Singh / Unsplash