When people delete their cookies or switch to private browsing, they often assume they’ve stepped out of view, leaving behind little that can be used to track their activity across the internet. However, research conducted by computer scientists at Texas A&M University and Johns Hopkins University has shown that this belief is becoming less true over time. Rather than relying on cookies alone, many advertising systems have shifted to a more elusive and far-reaching technique known as browser fingerprinting, which quietly pieces together a digital profile using small bits of information leaked by the browser itself.
This fingerprint, built invisibly in the background, is created using elements like the user’s screen size, device type, operating system, installed fonts, time zone, and even behaviors observed during browsing, such as mouse movements and scrolling patterns. While these individual traits might not seem revealing, their combination often results in a fingerprint distinctive enough to identify a person’s browser from others across repeated visits. Unlike cookies, which can be cleared or blocked with some effort, fingerprints are harder to disguise or eliminate, especially when websites layer them together using different detection methods.
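The core idea is that many weak signals, combined, become a strong identifier. The sketch below is an illustrative simplification, not any tracker's actual code: it concatenates a handful of browser attributes (the names and values are invented for the example) and hashes them, showing how changing even one attribute produces an entirely different fingerprint.

```python
import hashlib

def fingerprint(attrs: dict) -> str:
    """Combine individually weak signals into one stable identifier.
    Attribute names here are illustrative, not a real tracker's schema."""
    canonical = "|".join(f"{k}={attrs[k]}" for k in sorted(attrs))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

browser_a = {
    "screen": "1920x1080",
    "os": "Windows 11",
    "timezone": "UTC-5",
    "fonts": "Arial,Calibri,Segoe UI",
}
# Same device, but one attribute changed:
browser_b = dict(browser_a, timezone="UTC-6")

print(fingerprint(browser_a) != fingerprint(browser_b))  # True
```

Because the hash is stable across visits as long as the attributes stay the same, no cookie needs to be stored for the identifier to persist, which is why clearing cookies alone does not break the link.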
To investigate how this kind of tracking is applied in the real world, the research team developed a large-scale measurement system called FPTrace. This framework was specifically designed to behave like an ordinary internet user, capable of visiting various websites, interacting with content, and capturing advertising data along the way. But unlike regular browsers, this tool allowed researchers to systematically adjust the browser's fingerprint and study how those changes affected the ads being displayed, the value of bids submitted by advertisers, and the extent of data-sharing events logged in the background. By repeating visits under different conditions, the system revealed patterns in how tracking systems responded to subtle identity shifts.
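The measurement logic amounts to an A/B comparison: visit the same pages under a baseline fingerprint and a spoofed one (cookies cleared in both conditions), then compare the ad-side signals recorded in each. The harness below is a hypothetical sketch of that comparison step only, with invented function names and toy numbers; it is not FPTrace's actual implementation.

```python
from statistics import mean

def compare_conditions(baseline_visits, spoofed_visits):
    """Each visit is a dict like {"bids": [cpm, ...], "sync_events": int}.
    Returns the relative change in mean bid value and sync-event count
    between the baseline and spoofed-fingerprint conditions."""
    def summarize(visits):
        bids = [b for v in visits for b in v["bids"]]
        syncs = sum(v["sync_events"] for v in visits)
        return (mean(bids) if bids else 0.0), syncs

    base_bid, base_sync = summarize(baseline_visits)
    spoof_bid, spoof_sync = summarize(spoofed_visits)
    return {
        "bid_change": (spoof_bid - base_bid) / base_bid if base_bid else None,
        "sync_change": (spoof_sync - base_sync) / base_sync if base_sync else None,
    }

# Toy numbers, not the study's data:
baseline = [{"bids": [1.2, 0.9], "sync_events": 8}, {"bids": [1.1], "sync_events": 7}]
spoofed  = [{"bids": [0.5, 0.4], "sync_events": 2}, {"bids": [0.6], "sync_events": 1}]
print(compare_conditions(baseline, spoofed))
```

A sharp drop in bids and sync events when only the fingerprint changes (cookies held constant at zero) is what lets the researchers attribute the tracking signal to the fingerprint itself.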
Their findings confirm that browser fingerprinting is not just a technical curiosity or a niche tactic for fraud prevention but is actively being used to monitor users and tailor advertising content, even after users believe they’ve opted out or erased their tracking history. In controlled experiments, researchers observed that even when all cookies had been removed, modifying the fingerprint alone caused noticeable changes in advertiser behavior. Bids from advertisers fluctuated depending on whether a genuine or spoofed fingerprint was presented, and the number of data sync events, where information is shared between multiple parties, declined sharply when the fingerprint changed. This points to a direct link between the fingerprint and how the advertising system identifies and values each visitor.
What’s more revealing is that this kind of tracking continued even under legal environments that are supposed to prevent it. The team tested browser activity in scenarios simulating users covered by privacy laws such as Europe’s GDPR and California’s CCPA. Despite clearly selecting options to reject tracking, the test browsers were still followed through fingerprint-based methods. This silent circumvention of legal protections suggests that many compliance tools offered by websites, often through pop-ups and consent banners, may not fully block more advanced methods of identification.
During the study, fingerprinting was also examined in relation to a process known as cookie restoration, where deleted cookies appear to be quietly reinstated by websites. While the researchers recorded hundreds of cases where cookies were recovered in scenarios involving fingerprinting, the evidence stopped short of proving that fingerprinting alone was responsible for bringing the cookies back. Still, the fact that similar cookies reappeared after removal, and differed depending on whether the fingerprint had been altered, raises questions about how resilient trackers can be when working in tandem with browser-level identifiers.
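Detecting cookie restoration in a crawl boils down to comparing the cookie jar recorded before deletion with the one recorded after a clear-and-revisit cycle. The snippet below is a simplified model of that check, with invented cookie names; real pipelines would also track which party set each cookie and when.

```python
def restored_cookies(before_deletion, after_revisit):
    """Flag cookies whose name and value reappear verbatim after deletion.
    Both arguments map cookie name -> value; a simplified model of what a
    crawler would record around a clear-and-revisit cycle."""
    return {
        name: value
        for name, value in after_revisit.items()
        if before_deletion.get(name) == value
    }

before = {"uid": "abc123", "session": "tmp-1"}
after  = {"uid": "abc123", "session": "tmp-9"}  # "uid" came back verbatim
print(restored_cookies(before, after))  # {'uid': 'abc123'}
```

An identifier that survives deletion verbatim suggests some server-side or fingerprint-keyed record was used to re-issue it, which is exactly the ambiguity the study could observe but not conclusively attribute to fingerprinting alone.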
The implications of this research reach beyond a simple privacy concern. Fingerprinting allows websites and advertisers to profile users without permission, in ways that are difficult for users to notice or control. Most mainstream privacy tools, including those built into popular browsers, struggle to fully shield users from these techniques, especially as fingerprinting evolves to take advantage of inconsistencies across software, hardware, and browsing behavior. The study also highlights how difficult it is to audit or regulate these practices, since fingerprinting often operates in areas not directly addressed by current legal frameworks.
By introducing FPTrace, the researchers aim to give regulators and developers a way to evaluate fingerprint-based tracking more transparently. The system doesn't just flag the presence of fingerprinting code; it links these techniques to actual outcomes in ad targeting, bidding, and data collection, showing where and how they're being used to follow people online. The hope is that by shedding light on these hidden activities, new standards can emerge to hold advertising networks and websites accountable, particularly when users have not granted permission for such monitoring.
This study offers some of the first direct evidence tying browser fingerprinting to online ad tracking in a structured and measurable way. By going beyond static code analysis and focusing on how fingerprints influence economic behavior in real time, it changes the way online privacy should be understood. As the online advertising world grows more reliant on passive, hard-to-block techniques, this research underscores the growing gap between what privacy tools promise and what they can actually deliver.