OPINION & ORDER
RONNIE ABRAMS, District Judge:

Plaintiff James Robinson brings this class action against Defendant Disney Online (“Disney”), alleging violations of the Video Privacy Protection Act (the “VPPA”), 18 U.S.C. § 2710. He claims that Disney unlawfully disclosed personally identifiable information (“PII”) — the encrypted serial number of the digital device he used to access Disney video content, as well as his viewing history — to Adobe, a third-party data analytics company. Adobe purportedly combined these disclosures with additional information gathered from other sources, and used this composite data to identify Robinson and attribute his viewing history to him. Before the Court is Disney’s motion to dismiss Robinson’s Amended Complaint pursuant to Fed.R.Civ.P. 12(b)(6). For the reasons that follow, Disney’s motion is granted.
BACKGROUND
Robinson’s Amended Complaint concerns videos he purportedly viewed using a Roku, a “digital media-streaming device that delivers videos, news, games, and other content to consumers’ televisions via the Internet.” Am. Compl. (Dkt. 20) ¶ 1 n. 1. Through the Roku Channel Store — an “online digital media platform” — Robinson downloaded the Disney Channel application, which, once installed on his Roku, allowed him to view Disney’s proprietary video content. Id. ¶¶ 9-10.
“Unbeknownst to its users,” Robinson claims, “each time they use the Disney Channel to watch videos or television shows, Disney discloses their personally identifiable information — including a record of every video clip viewed by the user ... to unrelated third parties.” Id. ¶ 2; see also id. ¶ 13. He further claims that this record is “sent each time that a user views a video clip,” and is accompanied by the “hashed serial number associated with the user’s Roku device.” Id. ¶ 13. This hashed — or anonymized — serial number is “unique to a ... device and remain[s] constant for the lifetime of that device.” Id. ¶ 18.
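For illustration only, the following minimal Python sketch shows how a device serial number might be hashed into the kind of stable, anonymized identifier the Amended Complaint describes. The hash function (SHA-256) and the sample serial number are assumptions made for the sketch; the pleadings do not specify Disney's actual implementation.

```python
import hashlib

def hash_serial(serial_number: str) -> str:
    """Return a deterministic, anonymized digest of a device serial number.

    Illustrative only: SHA-256 is assumed here; the record does not say
    which hashing scheme Disney's Roku channel actually used.
    """
    return hashlib.sha256(serial_number.encode("utf-8")).hexdigest()

# A hypothetical Roku serial number (not drawn from the record).
serial = "1GU4AB123456"

# The same input always produces the same digest, so the hashed value is
# unique to the device and constant for its lifetime, yet it does not,
# by itself, name the device's owner.
print(hash_serial(serial))
print(hash_serial(serial) == hash_serial(serial))  # True
```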
Disney, according to Robinson, programmed its Roku channel to send this information to Adobe, a third-party data analytics company. See id. ¶¶ 3, 13. Adobe, and companies like it, purportedly maintain “massive digital dossiers on consumers” by aggregating consumer data collected from an array of sources, including applications like the Disney Channel. See id. ¶¶ 22-29. Robinson claims that “Adobe has the capability to use” this aggregated data to “personally identify ... users and associate their video viewing selections with a personalized profile in its databases.” Id. ¶ 29.
Robinson “downloaded and began using the Disney Channel on his Roku” device beginning in December 2013. Id. ¶ 39. He claims that Disney disclosed the hashed serial number of his device and his viewing history to Adobe without his consent, id. ¶ 40, and that this information constitutes PII “in this context because it allows Adobe to identify users ... and to attribute their video viewing records to their existing profiles,” id. ¶ 56. He further alleges that Adobe actually identified him and “attribute[d] his viewing choices to his profile” using the information disclosed by Disney. Id. ¶ 57.[1] Robinson argues that these disclosures amounted to violations of the VPPA. Id. ¶ 59. Disney argues otherwise, and seeks dismissal of Robinson’s Amended Complaint in its entirety. See Dkt. 30. The Court heard oral argument from the parties on October 5, 2015.
LEGAL STANDARD
To survive a motion to dismiss under Fed.R.Civ.P. 12(b)(6), a pleading must contain “a short and plain statement of the claim showing that the pleader is entitled to relief,” Fed.R.Civ.P. 8(a)(2), and be “plausible on its face,” Bell Atl. Corp. v. Twombly, 550 U.S. 544, 570, 127 S.Ct. 1955, 167 L.Ed.2d 929 (2007). “A claim has facial plausibility when the plaintiff pleads factual content that allows the court to draw the reasonable inference that the defendant is liable for the misconduct alleged.” Ashcroft v. Iqbal, 556 U.S. 662, 678, 129 S.Ct. 1937, 173 L.Ed.2d 868 (2009).
DISCUSSION
The VPPA prohibits a “video tape service provider” from “knowingly disclos[ing], to any person, personally identifiable information concerning any consumer of such provider.” 18 U.S.C. § 2710(b)(1). Its impetus was the publication in “a weekly newspaper in Washington” of a “profile of Judge Robert H. Bork based on the titles of 146 films his family had rented from a video store.” Sen. Rep. 100-599, at 5 (1988).
As defined in the VPPA, a “video tape service provider” is “any person, engaged in the business, in or affecting interstate or foreign commerce, of rental, sale, or delivery of prerecorded video cassette tapes or similar audio visual materials,” 18 U.S.C. § 2710(a)(4); a “consumer” is “any renter, purchaser, or subscriber of goods or services from a video tape service provider,” 18 U.S.C. § 2710(a)(1); and “personally identifiable information” (“PII”) “includes information which identifies a person as having requested or obtained specific video materials or services from a video tape service provider,” 18 U.S.C. § 2710(a)(3). Disney contends that Robinson’s VPPA claim is statutorily precluded, both because he was not a consumer and because the information Disney transmitted to Adobe was not PII. See Mem. (Dkt. 31) 7-10, 16-17. The Court declines to address the former argument, as it concludes that the information Disney disclosed did not amount to PII.
The precise scope of PII under the VPPA is difficult to discern from the face of the statute — whether read in isolation or in its broader statutory context. As defined in Section 2710(a)(3), PII “includes information which identifies a person as having requested or obtained specific video materials or services from a video tape service provider.”
This language suggests that the information disclosed by a video tape service provider must, at the very least, identify a particular person — not just an anonymous individual — and connect this particular person with his or her viewing history. See In re Hulu Privacy Litig., 2014 WL 1724344, at *7 (N.D.Cal. Apr. 28, 2014) (defining PII as, in part, “information that identifies a specific person and ties that person to particular videos that the person watched”). This construction is consistent with the ordinary meaning of “a person,” as well as the plain meaning of the definition’s final element, the requirement that the disclosed information identify “a person as having requested or obtained specific video materials.” 18 U.S.C. § 2710(a)(3). It is also consistent with the VPPA’s legislative history. As explained in the Senate Report issued in advance of the statute’s enactment, “personally identifiable information is intended to be transaction-oriented. It is information that identifies a particular person as having engaged in a specific transaction with a video tape service provider.” Sen. Rep. 100-599, at 12 (emphasis added). The use of “includes” in the statutory definition is not to the contrary. See id. (“[PII] is information ....”) (emphasis added).
Less clear is the scope of information encompassed by PII, and how, precisely, this information must identify a person. Importantly, Robinson does not argue that the information disclosed by Disney — a “record of [his viewing] activities ... along with the hashed serial number associated with [his] Roku device,” Am. Compl. ¶¶ 13, 42 — constitutes PII in its own right. Instead, he argues that the information constitutes PII because Adobe, the recipient of Disney’s disclosures, can identify him by “link[ing]” these disclosures with “existing personal information” obtained elsewhere. See Am. Compl. ¶¶ 27, 29; Opp. 8-10. Indeed, the Court assumes, for the purposes of this motion, that Adobe has actually identified him in this manner. See infra note 1. Disney responds that the VPPA is not targeted at what non-defendant third parties might do with disclosures by video tape service providers, as PII is limited solely to information which, in and of itself, identifies a person. See Mem. 7-10. Because the anonymized disclosures here do not themselves identify a specific person, Disney contends, they are not prohibited. See id.
Robinson’s theory of liability is not without support in the existing case law. Indeed, Yershov v. Gannett Satellite Info. Network, Inc., 104 F.Supp.3d 135 (D.Mass. 2015), expressly rejects the view of PII urged by Disney. There, the District of Massachusetts concluded that the disclosures at issue — the transmission of viewing records of the USA Today application on an Android device, in addition to the “user’s GPS coordinates and the ... device’s unique identification number” — constituted PII despite requiring additional information before the plaintiff was linked to his video history. Id. at 137-38, 142-43, 145-46. The majority of courts to address this issue, however, have adopted a narrower definition of PII. See In re Nickelodeon Consumer Privacy Litig., 2014 WL 3012873, at *10 (D.N.J. July 2, 2014) (“[PII is] information which must, without more, itself link an actual person to actual video materials.”); Ellis v. Cartoon Network, Inc., 2014 WL 5023535, at *3 (N.D.Ga. Oct. 8, 2014), aff'd on other grounds, 803 F.3d 1251 (11th Cir.2015) (PII not disclosed where the third party to whom an Android ID and viewing history were provided had to “collect information from other sources” to identify the plaintiff); Locklear v. Dow Jones & Co., 101 F.Supp.3d 1312, 1318 (N.D.Ga.2015), abrogated on other grounds, 803 F.3d 1251 (11th Cir.2015) (“[A] Roku serial number, without more, is not akin to identifying a particular person, and therefore, is not PII.” (quotations omitted)); Eichenberger v. ESPN, Inc., C14-463, 2015 WL 7252985 (W.D.Wash. May 7, 2015) (allegation that Adobe “used information gathered from other sources to link plaintiff’s Roku device serial number and the record of what videos were watched to plaintiff’s identity” failed to state a claim for disclosure of PII under the VPPA). The Court finds this latter, majority view more persuasive.
A discussion of Yershov is nevertheless instructive. The district court there began its analysis with the premise, rooted in the statutory text, that “a consumer’s name and address are both PII, and ... that the universe of PII is greater than the consumer’s name and address.” 104 F.Supp.3d at 140 (analyzing Section 2710(b)(2)(D)’s exception to the general prohibition on the disclosure of PII, pursuant to which such disclosures are permissible insofar as they consist “solely of the names and addresses of consumers,” among other requirements); accord Nickelodeon, 2014 WL 3012873, at *9 (“[N]ames and addresses are but a subset of PII.”); Hulu, 2014 WL 1724344, at *11 (“The statute does not require a name.”); Locklear, 101 F.Supp.3d at 1316 (“[A] person can be identified by more than just their name and address.”). The Court agrees with the Yershov court that names and addresses are PII for the purposes of the VPPA, and that PII, in this statutory context, includes more than just names and addresses; it would be difficult to read the language of the statute otherwise. Neither party disputes this premise.
The question for the Court is whether this premise necessarily leads to the Yershov court’s conclusion that information can amount to PII even when it does not, on its own, identify a specific person. As a practical matter, it is surely right — or at least often so — that addresses and even names “cannot be linked to a specific person without access to certain additional information.” 104 F.Supp.3d at 142. Which John Smith, or which Main Street, among thousands? And there is, undoubtedly, an intuitive appeal to the Yershov court’s conclusion that it would thus be “unrealistic to refer to PII as information which must, without more, itself link an actual person to actual video materials.” Id. at 146. As that court stated, defining PII so narrowly would “preclude a finding that home addresses ... are PII,” and thus conflict with the VPPA’s plain statutory language. Id. at 142-43.
But in the end, this conclusion is at odds with the VPPA’s particularized definition of PII and is overly expansive. If nearly any piece of information can, with enough effort on behalf of the recipient, be combined with other information so as to identify a person, then the scope of PII would be limitless. Accord Nickelodeon, 2014 WL 3012873, at *11 (“Certainly, this type of information might one day serve as the basis of personal identification after some effort on the part of the recipient, but the same could be said for nearly any type of personal information.”). Whatever the impact of modern digital technologies on the manner in which personal information is shared, stored, and understood by third parties like Adobe, the Court cannot ascribe such an expansive intent to Congress in enacting the VPPA. It would render meaningless the requirement that the information identify a specific person as having rented or watched specific videos, as all information would, with some work, be identifying, and would transmute a statute focused on disclosure of specific information into one principally concerned with what third parties might conceivably be able to do with far more limited disclosures.[2]
It is true, of course, that liability would not be imposed on providers like Disney unless they also knew that the information disclosed was personally identifying, see 18 U.S.C. § 2710(a)(3), but this knowledge requirement would not operate as any real limitation on liability; if virtually all information can, in the end, be identifying, it is hard to conceive of a case in which a disclosure would not be considered knowing. Other limiting principles might plausibly be read into the VPPA to address the concern of overbreadth. For instance, information might constitute PII only if the third-party recipient has the ability, at the moment of disclosure (and not just theoretically), to combine it with other information and identify the underlying consumer. But such a principle would necessarily either read into the VPPA a requirement that providers not only know the nature of the information actually disclosed by them, but also know the informational capabilities of any third-party recipient, or, to the extent “knowing” is limited to knowledge of disclosure, hold providers liable even where the ability of third-party recipients to compile identifying information was unknown to them. Both constructions are unsupported by the statutory text.
Indeed, the most natural reading of PII suggests that it is the information actually “disclos[ed]” by a “video tape service provider,” 18 U.S.C. § 2710(b)(1), which must itself do the identifying that is relevant for purposes of the VPPA (literally, “information which identifies”) — not information disclosed by a provider plus other pieces of information collected elsewhere by non-defendant third parties. This is the argument urged by Disney, and it is the definition of PII that this Court now adopts. PII is information which itself identifies a particular person as having accessed specific video materials.
That names and addresses are expressly included within the definition of PII, as is clear from the face of the VPPA, see 18 U.S.C. § 2710(b)(2)(D), does not foreclose this construction, even recognizing that names and addresses may, as the Yershov court noted, require additional information before they identify specific people. Instead, the inclusion of names and addresses as examples of PII in the VPPA suggests that Congress considered names and addresses to be sufficiently identifying without more. That is, a stronger reading of the VPPA suggests that these pieces of information are per se identifying such that their knowing disclosure amounts to a violation of the statute.
Nor is the Court’s reading foreclosed by the VPPA’s use of the word “includes” in the statutory definition of PII. The Senate Report accompanying the VPPA, which notes that “paragraph (a)(3) uses the word ‘includes’ to establish a minimum, but not exclusive, definition of personally identifiable information,” makes clear that “includes” is used not to suggest that PII encompasses more than “information which identifies a person as having requested ... specific video materials,” but instead to signal that PII must, at the very least, “identif[y] a particular person as having engaged in a specific transaction.” Sen. Rep. 100-599, at 11-12. To be PII, information must identify a specific person and must tie this person to specific video materials; it can do no less, but the scope of what constitutes PII is not otherwise limited.
None of which is to say that context is irrelevant. Context may matter, for instance, to the extent other information disclosed by the provider permits a “mutual understanding that there has been a disclosure” of PII. In re Hulu Privacy Litig., 86 F.Supp.3d 1090, 1097 (N.D.Cal.2015). Thus, as the Hulu court concluded, although “a unique anonymized ID alone is not PII ... context could render it not anonymous and the equivalent of the identification of a specific person”; “[o]ne could not skirt liability under the VPPA by disclosing a unique identifier and a correlated lookup table.” 2014 WL 1724344, at *11.[3] Disney could not disclose the information at issue here, along with a code that enabled Adobe to decrypt the hashed serial number and other information necessary to determine the specific device’s user, and still evade liability. But recognizing that context matters — and that a third-party recipient needs to know the import or nature of the information it receives for that information to have meaning — is not the same as concluding that information which is not otherwise PII can somehow become PII because of the potential, however remote, of a third party to “reverse engineer” a disclosure using data gathered from other sources.
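To make the “correlated look-up table” point concrete, the following Python sketch (with hypothetical names and values not drawn from the record) contrasts a disclosure of an anonymized identifier standing alone with a disclosure accompanied by a correlated table; only the latter ties a viewing record to a specific person.

```python
from typing import Optional

# Hypothetical disclosure of the kind alleged here: an anonymized device
# identifier paired with a viewing record (all values are invented).
disclosure = {
    "hashed_serial": "9f2c41aa07",
    "video": "Example Clip",
}

# A correlated look-up table mapping the anonymized identifier back to a
# subscriber. Under the reasoning in Hulu, disclosing this table alongside
# the record above would be the equivalent of identifying a specific person.
lookup_table = {
    "9f2c41aa07": "Jane Doe, 123 Main St.",
}

def identify(record: dict, table: Optional[dict]) -> Optional[str]:
    """Return the person tied to a viewing record, but only if the recipient
    also holds a table correlating the anonymized ID to an identity;
    otherwise the record identifies a device, not a person."""
    if table is None:
        return None
    return table.get(record["hashed_serial"])

print(identify(disclosure, None))          # None: no person identified
print(identify(disclosure, lookup_table))  # 'Jane Doe, 123 Main St.'
```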
Pruitt v. Comcast Cable Holdings, LLC, 100 Fed.Appx. 713 (10th Cir.2004), on which Robinson relies, see Opp. 17, does not suggest otherwise. In Pruitt, current and former Comcast cable subscribers alleged that Comcast had violated the Cable Communications Policy Act of 1984 (“Cable Act”), 47 U.S.C. § 551 et seq., by retaining personally identifiable information in its cable boxes, namely, unique anonymous IDs. See 100 Fed.Appx. at 715-17. They also argued that Comcast could “identify a customer’s viewing habits by connecting the coded information with its billing or management system.” Id. at 716. The district court rejected the first theory of liability, as did the Tenth Circuit on appeal, holding that “[w]ithout the information in the billing or management system one cannot connect the unit address with a specific customer; without the billing information, even Comcast would be unable to identify which individual household was associated with the raw data in the converter box.” Id. And because plaintiffs had not alleged “that the retention of data in the billing or management systems violates the Cable Act,” the Circuit declined to reach the second theory — that on which Robinson relies — altogether. Id. at 717.
Even assuming the Tenth Circuit had endorsed this second theory, however, Pruitt does not stand for the proposition that unique, anonymous IDs, when disclosed to third parties, become PII. It merely reaffirms the Hulu court’s suggestion, discussed above, that “[o]ne could not skirt liability under the VPPA, for example, by disclosing a unique identifier and a correlated look-up table.” 2014 WL 1724344, at *11. In Pruitt, Comcast controlled both the information stored in the cable boxes and the correlated information in its billing system. See 100 Fed.Appx. at 715. Because the Cable Act prohibits storage of PII, and Comcast possessed within its internal systems both a unique identifier and a look-up table, it may well have been liable had plaintiffs properly alleged that theory. But the VPPA, as noted, is concerned with disclosure, and while Disney did disclose encrypted device serial numbers, it did not disclose a correlated decryption table or other identifying information.
The definition of PII the Court hereby adopts readily distinguishes between names and addresses, on the one hand, and an anonymized device serial number, on the other. If PII is information which must itself identify a particular person as having viewed specific video materials, the primary question for the reviewing court is whether the challenged disclosure similarly identifies a person. Whereas names and addresses, as a statutory matter, do identify a specific person, the anonymized Roku serial number at issue here does not; it identifies a specific device, and nothing more. In light of the Court’s conclusion regarding the definition and scope of PII, Disney’s liability turns only on whether the information it disclosed itself identified a specific person. It did not. Thus, Adobe’s ability to identify Robinson by linking this disclosure with other information is of little significance.
Finally, Robinson has not alleged that the hashed serial number of his Roku device amounts to a geographic identifier. See Am. Compl. ¶ 13. It is thus unlike a home address, which ties a specific person to a specific place. Nor is the information disclosed by Disney equivalent to a Facebook ID. A “Facebook user — even one using a nickname — generally is an identified person on a social network platform.” Hulu, 2014 WL 1724344, at *14. A Facebook ID, as the Hulu court found, is thus equivalent to a name — it stands in for a specific person, unlike a device identifier. See id. Disney has also not disclosed a “correlated look-up table” that would enable Adobe to link the hashed serial number of Robinson’s Roku device and his viewing choices to his identity. Instead, as Robinson himself alleges, it is Adobe, not Disney, which has purportedly assembled the equivalent of a “look-up table” — with information obtained from third-party sources, including Roku. See Am. Compl. ¶¶ 22-29. This is insufficient to constitute a violation of the VPPA.
CONCLUSION
As alleged, the disclosures at issue here indicate only that a specific device somewhere was used by someone to watch specific videos. Id. ¶ 13. An unrelated third party equipped with the information purportedly disclosed by Disney, and nothing more, could not identify Robinson. The third party would not know his name, his age, his gender, his social security number, his home address or any other information tantamount to a physical location, or any similar details that would enable it to identify Robinson as the specific person accessing specific videos on his specific Roku device. The somewhere and someone remain unknown until Adobe purportedly combines Disney’s disclosure with other information collected from elsewhere. See Am. Compl. ¶¶ 27, 29, 55-57. Thus, Robinson’s allegations, as measured against the definition of personally identifiable information adopted by the Court, fail to show that he is entitled to relief. Accordingly, Robinson cannot make out a viable claim under the VPPA, and his Amended Complaint must be dismissed.
In dismissing this action, the Court is sensitive to the policy implications posed by the increasing ubiquity of digital technologies, which, as Robinson ably alleges, have dramatically expanded the depth, range, and availability of detailed, highly personal consumer information. There is no doubt that the world of Roku devices, streaming video, and data analytics is a very different one from that of the physical video stores and tape rentals in which the VPPA was originally passed, and that, as the Yershov court noted, deciding VPPA cases today is thus akin to placing “a square peg ... into a round hole.” 104 F.Supp.3d at 140. But while the Court recognizes the frustration of an individual such as Robinson — who seeks to keep his information private, whether it is personally identifying or not — the VPPA as written, and even as amended in 2013, does not afford him, or those similarly situated, a remedy.
For the reasons stated above, Disney’s motion to dismiss is granted. The Clerk of Court is requested to terminate the motion pending at Dkt. 30 and close the case.
SO ORDERED.
ORDER
On November 18, 2015, Plaintiff James Robinson moved for the Court to reconsider its October 20, 2015 order, which granted Defendant’s motion to dismiss. In its October 20 order, the Court held that Robinson failed to state a claim under the Video Privacy Protection Act (the “VPPA”), 18 U.S.C. § 2710, because the information Disney had allegedly disclosed — a “hashed” serial number of a digital device Robinson used to access Disney content together with his viewing history — did not constitute personally identifiable information (“PII”), as defined by the VPPA. Robinson now urges the Court to reconsider its ruling, on the grounds that it “overlooked” certain “matters ‘that might reasonably be expected to alter the [Court’s] reasoning.’” Pl.’s Mem. 2-3 (internal citation omitted). In particular, he argues that the Court failed to understand the “subtle — but important — distinction between singling out and identifying” an individual, and that the test articulated by the Court is “unworkable.” Id. at 3, 10. Under Robinson’s theory, whether a disclosure identifies an individual is a question for the jury. For the reasons articulated in its October 20 order, the Court disagrees.
“A motion for reconsideration should be granted only when the defendant identifies ‘an intervening change of controlling law, the availability of new evidence, or the need to correct a clear error or prevent manifest injustice.’” Kolel Beth Yechiel Mechil of Tartikov, Inc. v. YLL Irrevocable Trust, 729 F.3d 99, 104 (2d Cir.2013) (quoting Virgin Atl. Airways, Ltd. v. Nat’l Mediation Bd., 956 F.2d 1245, 1255 (2d Cir.1992)). “A party making a motion for reconsideration is not supposed to treat the court’s initial decision as the opening of a dialogue in which that party may then use such a motion to advance new theories or adduce new evidence in response to the court’s rulings, much less rehash arguments already made and rejected.” Pfizer, Inc. v. Stryker Corp., No. 02-CV-8613 (LAK), 2005 WL 383702, at *1 (S.D.N.Y. Feb. 17, 2005) (internal citation and quotations omitted). Plaintiff has done exactly that. His motion does not identify any change of controlling law or present new evidence, nor does he argue that reconsideration will prevent manifest injustice. Instead, he merely attempts to take another bite at the apple, repackaging arguments that this Court has already rejected and improperly seeking to advance new — and unconvincing — theories. Plaintiff’s motion for reconsideration is denied.
SO ORDERED.
[1] The parties disagree about whether Robinson has adequately alleged that Adobe actually identified him. Disney argues that this allegation — to the extent it is even made in the Amended Complaint — is conclusory and thus insufficient to satisfy the pleading requirements of Fed.R.Civ.P. 8, while Robinson contends that in light of the informational asymmetries in this case, as well as the publicly available information about Adobe’s identification capabilities documented in his Amended Complaint, he has sufficiently alleged actual identification. Even if the Court were to resolve this dispute in Robinson’s favor, however, his allegations would not be sufficient, as a matter of law, to survive Disney’s motion to dismiss. The Court thus assumes, for the purpose of deciding this motion, that Adobe did actually identify Robinson.
[2] Plaintiff’s contention, at oral argument, that hashed serial numbers are, like names or social security numbers, just randomized strings of numbers and/or letters, similarly goes too far. Plaintiff’s argument merely demonstrates what should already be obvious: much of human language is symbolic, communicated through systems of letters and numbers. But such a generalized principle is not particularly useful in determining what PII — a statutorily defined term — means in this context.
[3] The Hulu court’s discussion of “context” is consistent with the agency regulations implementing the Family Educational Rights and Privacy Act of 1974 (FERPA), which bars, in part, the disclosure of PII in educational records. Pursuant to these regulations, PII includes a range of so-called “personal identifiers,” such as a student’s social security number or biometric record, as well as “other information that, alone or in combination, is linked or linkable to a specific student that would allow a reasonable person in the school community, who does not have personal knowledge of the relevant circumstances, to identify the student with reasonable certainty.” 34 C.F.R. § 99.3. Thus, the regulations are only concerned with the content of disclosures actually made by the educational provider — and are concerned with context only to the extent that multiple pieces of information disclosed by a provider, none of which themselves amount to PII, could, when combined with one another, prove identifying.