United States v. Google/Findings of Fact/Section 2I

I. User Data and Privacy

116. Google recognizes that users increasingly care about the privacy of their online activity. See generally UPX1069. See Tr. at 7471:5-25 (Raghavan); id. at 8994:22–8995:1 (Fitzpatrick) (“[E]xpectations around privacy from our users from, frankly, society across the tech industry, have evolved pretty significantly.”); id. at 8995:13-16 (Fitzpatrick) (noting that “focus on privacy as a topic has really elevated and increased” recently). So do browser developers, see id. at 2484:6-11 (Cue) (Apple); M. Baker Dep. Tr. at 117:8–118:7 (Mozilla), and other GSEs, Tr. at 3677:19–3679:16 (Ramaswamy) (Neeva); UPX720 at 249–53 (DDG).

117. Google has a Privacy, Safety, and Security team that focuses, among other things, “on both building proactive privacy protections into [Google] products, as well as building technical privacy protections into [the] systems and infrastructure,” and “keeping users safe in [Google] products.” Tr. at 8989:19-24 (Fitzpatrick). Google surveys users about its privacy offerings. See, e.g., DX183 (2020 study assessing user trust related to privacy).

118. When Google makes decisions about privacy-focused features, rivals’ privacy offerings are “something [Google] keep[s] an eye on” as one of “many” data points. Tr. at 8998:1-4 (Fitzpatrick). Google has several times considered undertaking privacy initiatives after looking to rivals. See, e.g., UPX811 at 420 (comparing Google to DDG and recommending Google adopt certain features); UPX794 at 146 (same).

119. But Google also considers the business case for making privacy-focused changes. UPX501 at 520 (2019 email from Raghavan stating that merely because “people care increasingly about privacy” and “DDG is making a lot [of] noise about it,” it did not mean that Google needed “a product change”); see Tr. at 7411:17-21 (Raghavan) (“And the team that came forward with the proposal said we need to do exactly what [DDG’s] doing. And my pushback was maybe we do, maybe we don’t, but I’d like to see the data on the impact on users, and on our ability to build a good search and search ad system.”).

120. Google believes that there is a trade-off between search quality and user privacy. See Tr. at 8998:1-7 (Fitzpatrick) (“But when we’re designing, whether it’s a product overall, a new feature, or a privacy control or capability, end of the day the question is: How do we do what’s right for our users?”); id. at 7475:1-2 (Raghavan) (agreeing that an incognito mode feature could be accomplished “[a]s a technical matter,” but “[t]hat doesn’t make a good product design”); UPX500 at 518 (“DDG might also not be the best model for Google users’ privacy needs[.]”); UPX501 at 520 (“I want to see evidence that there’s a real impact on Google users, attributable to” privacy.).

121. The degree of privacy a GSE offers reflects a series of individual design decisions. Whether to track a user’s session data is one such decision. According to Google, tracking user sessions is “measurably beneficial to the user experience, including things like []in-session use of context to improve results.” Tr. at 9035:22–9036:1 (Fitzpatrick). Such data also helps to tailor the advertisements that Google delivers to a user. See id. at 7457:23–7458:9 (Raghavan); id. at 9069:15-23 (Fitzpatrick). DDG, on the other hand, anonymizes user click data and does not track user sessions. Id. at 2050:24–2051:7 (Weinberg). It therefore cannot discern whether multiple searches come from the same user performing different actions. Id. at 2051:3-7 (Weinberg); id. at 1944:14-18 (Weinberg) (“[I]f 100 people search for cat pictures today, we don’t really know whether it’s like one person or 100 different people.”).

122. How a GSE uses IP addresses is another design decision. Google logs IP addresses and uses them to customize search results. See, e.g., id. at 1772:22–1773:15 (Lehman) (“[K]nowing a person’s . . . location can sometimes help understand what it is they’re looking for.”); id. at 1778:16-18 (“[I]n general, showing people search results that are appropriate to their location for a certain query is important[.]”). DDG, in contrast, does not log IP addresses. Instead, it “use[s] the location that [it] get[s] via the IP address, and then [it] throw[s] it away after the search is done.” Id. at 2085:25–2086:1 (Weinberg).

123. Google also logs IP addresses to enhance security. Id. at 7413:25–7414:10 (Raghavan) (Google logs IP addresses to detect and combat botnets and fraudulent clicks). DDG “had developed [its] own click fraud systems” that do not require logging of IP addresses. Id. at 2069:10-11 (Weinberg); DX621 at 100.

124. Another question of privacy design is whether to invite users to “sign in.” Google does so because it believes such functionality improves search results and overall search engine quality. See Tr. at 3737:5-8 (Ramaswamy) (personalization improves search quality). DDG does not have an option for users to “sign in” to its platform. Id. at 1944:14-15 (Weinberg) (“[E]very time you search on DuckDuckGo, it’s like it’s your first time[.]”).

125. How much user data a GSE retains also is a measure of privacy. Google chose to retain 18 months of user data by default, even though some survey data suggested users preferred a shorter retention period. UPX996 at 978 (49% of users surveyed would prefer that Google store one month or less of data, and 74% wanted Google to store their data for under one year). The decision to retain 18 months of a user’s data versus fewer months was largely arbitrary. Tr. at 9013:9-18 (Fitzpatrick) (While Google “felt like it was important to have a default that was greater than that one-year boundary to allow for . . . annual seasonality [of information] to still be preserved,” the decision to default to 18 months (as opposed to 13 months) was because 13 “felt like a really weird number” and 18 months “just felt a little . . . better.”).