Apologies, this season was set in 1986, but my point remains.
Non-historian here, but I can provide some sourced info.
The internet as we’re used to it today really dates to the 1990s, when commercial businesses started being linked to it, with the commercialisation of the main backbone in 1994 under Clinton. The 1980s marked the transition away from ARPANET (starting around 1984), and academic communities were often linked in, especially in the US. However, to my knowledge (and unlike the rest of this post I cannot source this) these were mostly universities, not earlier education, because they had the funding to be connected. Audiences were already familiar with the idea of hacking into computers remotely, though, from an even earlier famous 1980s movie, WarGames (1983), which depicted a teenager remotely accessing a military computer.
That is, while the internet did not exist quite in its current form, interlinked computer networks did, and were known to audiences.
The technology was surprisingly advanced. Modems, which remained the normal way to dial in to the internet for years afterwards, had been around since the 1950s but took off in 1981 with the Hayes Smartmodem, which gave the computer a way to send commands to the modem itself (the ‘AT’ command set). Modems descending from this technology would be used well into the 2000s; in fact they still are if you’re one of the few using dialup internet today. Similarly, the first 32-bit IBM-compatible machine, the Compaq Deskpro 386, was introduced in 1986. 32-bit Intel CPUs (with significant improvements) remained dominant for decades and it’s only recently we’ve transitioned to their 64-bit successors. (The first consumer Windows OS for 64-bit was the 64-bit edition of Windows XP in 2005, almost twenty years later, and it took another decade before 64-bit became the normal default CPU.) In other words, 1986 saw the introduction of a computer model that, obviously with important speed and capability improvements, would be the template for the next two decades. 1986 home tech was eminently capable of connecting to remote computers.
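To make that “commands to the modem” idea concrete: Hayes-style modems accept short text commands over the serial line (ATZ to reset, ATDT to tone-dial a number, and so on), and that interface survived essentially unchanged into the dial-up era. A minimal sketch of what driving one looks like, assuming the third-party pyserial library and a made-up port name and phone number:

```python
# Minimal sketch of driving a Hayes-compatible modem with AT commands.
# The serial port name and phone number are placeholders for illustration.
import serial

PORT = "/dev/ttyS0"        # hypothetical serial port the modem is attached to
PHONE_NUMBER = "5550123"   # placeholder number

with serial.Serial(PORT, baudrate=2400, timeout=5) as modem:
    modem.write(b"ATZ\r")                                   # reset to defaults
    print(modem.readline())                                 # expect b"OK"
    modem.write(b"ATDT" + PHONE_NUMBER.encode() + b"\r")    # tone-dial the number
    print(modem.readline())                                 # expect b"CONNECT ..." on success
```

A 1980s home computer did the same job with its own terminal software rather than Python, but the command traffic going over the wire was this kind of thing.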
Similarly, servers like a DEC PDP-11 or VAX were common, including in schools. Computers were also increasingly widely used and taught: that article has some fascinating photos from the mid-eighties of school computer labs.
Thus, dialling in to a server with consumer technology was well established; the concept of hacking in remotely was established; servers for data were established; home computers had much more capability than might be assumed; and for schools that could afford it, computers were common by the time this season of Stranger Things occurred. In short: other than the question of whether this school could afford it, yes, this was a realistic scene.
I have no knowledge of schools in Indiana and cannot answer about their finances affording computer servers and labs, nor their technology awareness. Someone more familiar than me with the history of funding and technology in rural schools in the USA will need to answer that part.
The scene in Stranger Things is a homage to a scene in Ferris Bueller’s Day Off, a movie from 1986. Then-current audiences would already be familiar with computers speaking to other computers even if they hadn’t used them themselves, but note that Ferris’s high school serves affluent families in the Chicago suburbs, quite different to rural Indiana. Thus, in a sense, the answer to your question is “yes, it’s technologically realistic; whether or not it’s socially realistic, it’s a nice homage, so let’s ignore that.”
Schools in rural Indiana didn’t have access to the Internet then. The only schools that had Internet access were major universities. A teenager in Utah having home Internet access is even less plausible. But that doesn’t mean that the scene you’re asking about is implausible.
The Internet grew out of ARPANET, so-called because it was sponsored by the Advanced Research Projects Agency of the Department of Defense. ARPANET was, and the Internet is, a way to connect a bunch of Local Area Networks so that computers on any of the LANs can exchange data with computers on the other LANs. Many major universities were on ARPANET because they did a lot of defense research, so they had a good reason to be in close contact with DoD labs. By 1986 most of the military’s non-research computing had moved to a different network, MILNET. Since Hawkins National Lab was a research site, it could well be the only place in Hawkins that would have had ARPANET access.
When a computer uses the Internet to reach another computer (say, to post a question on this subreddit), it first contacts a specific other computer called a ‘gateway’. That computer then relays the relevant data to a series of other computers, eventually reaching a gateway that connects directly to a Reddit server. Each of the computers involved has what’s called a ‘routing table’ that helps it know which other computers to contact if trying to reach a certain address. If it’s a lot of data (say, a 20 megapixel cat photo) then it’s broken down into a bunch of ‘packets’ that are independently sent to the destination, potentially via different routes, and reassembled back into the original message. ARPANET worked in basically the same way, but it was vastly smaller than the Internet is now, to the point that it was possible to make complete maps of ARPANET (the linked map is accurate for Season 4’s time period). From the map we see that, in the real world, Purdue University had the only ARPANET access in Indiana.
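As a toy illustration of those routing tables and packets (the node names and topology here are invented, nothing to do with the real ARPANET map), here is a sketch of each hop looking up the next hop for a destination, and of a message being split into packets and reassembled:

```python
# Toy illustration of hop-by-hop forwarding with per-node routing tables.
# All names and the topology are invented for illustration only.
routing_tables = {
    "home":      {"reddit": "gateway-a"},   # home only knows its own gateway
    "gateway-a": {"reddit": "backbone"},
    "backbone":  {"reddit": "gateway-b"},
    "gateway-b": {"reddit": "reddit"},      # directly connected to the destination
}

def route(source: str, destination: str) -> list[str]:
    """Follow each node's own routing table until the destination is reached."""
    path = [source]
    node = source
    while node != destination:
        node = routing_tables[node][destination]   # next hop for this address
        path.append(node)
    return path

def to_packets(data: bytes, size: int = 4) -> list[bytes]:
    """Split a message into fixed-size packets that can travel independently."""
    return [data[i:i + size] for i in range(0, len(data), size)]

message = b"cat photo bytes..."
packets = to_packets(message)
print(route("home", "reddit"))       # ['home', 'gateway-a', 'backbone', 'gateway-b', 'reddit']
print(b"".join(packets) == message)  # True: the packets reassemble into the original
```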
Later on in Stranger Things, some characters use a phone number to contact a government organization via a computer. They aren’t using the Internet, ARPANET, or MILNET when they do this—one computer makes a direct phone connection to the other and the two computers exchange data directly. The full chain is computer - modem - telephone lines - modem - computer. You don’t need any of the stuff with gateways and routing tables if you have a phone number that connects directly to the specific computer you want to reach. The two modems also aren’t exchanging data digitally—they convert (‘modulate’) it to an audio representation, send that sound across the phone lines, and then convert it back (‘demodulate’) to digital data on the other end. A modem is really just a ‘MOdulator/DEModulator’.
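For a feel of what the modulate/demodulate step actually does, here is a rough sketch of frequency-shift keying, the scheme used by early modems like the Bell 103: each bit becomes a short audio tone, and the receiver decides which of the two tones it heard. The bit rate and framing here are simplified for illustration; real modems added handshaking, error handling, and much more:

```python
# Rough sketch of frequency-shift keying: each bit becomes a short tone,
# and the receiver recovers bits by checking which tone is dominant.
# Frequencies are Bell-103-style; the bit duration is illustrative only.
import numpy as np

SAMPLE_RATE = 8000          # samples per second of "phone line" audio
BIT_DURATION = 0.01         # seconds per bit (not a real standard)
FREQ_0, FREQ_1 = 1070.0, 1270.0   # tones for a 0 bit and a 1 bit (Hz)

def modulate(bits):
    """Turn a bit sequence into an audio waveform (one tone per bit)."""
    t = np.arange(int(SAMPLE_RATE * BIT_DURATION)) / SAMPLE_RATE
    tones = {0: np.sin(2 * np.pi * FREQ_0 * t), 1: np.sin(2 * np.pi * FREQ_1 * t)}
    return np.concatenate([tones[b] for b in bits])

def demodulate(audio):
    """Recover bits by correlating each bit-slot against the two reference tones."""
    n = int(SAMPLE_RATE * BIT_DURATION)
    t = np.arange(n) / SAMPLE_RATE
    ref0 = np.sin(2 * np.pi * FREQ_0 * t)
    ref1 = np.sin(2 * np.pi * FREQ_1 * t)
    bits = []
    for i in range(0, len(audio), n):
        chunk = audio[i:i + n]
        bits.append(1 if abs(np.dot(chunk, ref1)) > abs(np.dot(chunk, ref0)) else 0)
    return bits

original = [1, 0, 1, 1, 0, 0, 1]
assert demodulate(modulate(original)) == original   # the round trip recovers the bits
```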
By 1986 the home computer boom was well underway. Two years prior the Census Bureau reported that over 8% of households had one, and this reached 15% by 1989. Apple released the first Macintosh in 1984, and had already sold millions of the predecessor Apple II family. The original IBM PC had come out in 1981, and by 1986 a number of ‘PC clones’ were on the market too. 1986 was also the height of the Commodore 64’s popularity, and the improved Commodore 128 was available as well. Whether or not these computers were capable of connecting to ARPANET is academic as neither the school nor Suzie would have had ARPANET access. But any of them would have been perfectly capable of making a direct connection if the school had a grades database on a computer connected to a modem.
By 1989, two-thirds of teachers were using computers in their schools in the USA. This also presumably extended to computerizing school offices. The hacking scene implies that Hawkins High School must have had a grades database stored on a computer with a modem, possibly set up so that teachers could grade assignments at home and enter grades from a home computer into a central database. All it would require is for Dustin to obtain the phone number for the modem, and for “certified genius” Suzie to gain unauthorized access.
Also bear in mind that Stranger Things is rife with references to ‘80s culture, and one of the top movies of 1986, Ferris Bueller’s Day Off, features the title character hacking his school’s attendance records, so a hacking scene was probably inevitable regardless of plausibility.
Edit, with a couple of other ’80s home hacking references: WarGames (1983) features a child hacking a DoD computer from home, and “Zero Cool” from Hackers (1995) crashed over 1,500 business computers from home in 1988.
TL;DR no but also yes.
The original Internet was a loose amalgamation of academic, military, and corporate networks built around Arpanet, a national research project dating back to the late 1960s and early 1970s.
You can find maps of the development of Arpanet here:
https://personalpages.manchester.ac.uk/staff/m.dodge/cybergeography/atlas/historical.html
One of the most famous is the map from 1977, which shows every computer on Arpanet.
https://personalpages.manchester.ac.uk/staff/m.dodge/cybergeography/atlas/arpanet1977_large.gif
Most of the machines were PDP-10 (DEC 10... and DEC 20...) and PDP-11 computers made by Digital Equipment Corporation. Without getting into technicalities, the PDP-10 was a big but not huge system popular with universities. The PDP-11 was a smaller and simpler computer often used by research departments.
There are also some much bigger and more powerful machines made by IBM (360/...) and Control Data Corporation (CDC...).
This mix of medium/small research and large supercomputing defined the early growth of the network. Supercomputers are large and very expensive and one of the goals was to make that computing power available nationally. (In some cases internationally.) The other machines were workhorse computers used by universities and a handful of corporations (like Xerox) for general purpose academic computing (including admin) as well as computer science and R&D.
By the mid-80s the network spanned universities and research centres in Europe and the US. The UK's own network was called JANET (Joint Academic Network) and it connected the larger universities to Arpanet. Europe had a network called EARN.
But... this was not the Internet. It was more of a collection of independent networks with quirky names (BITNET, MILNET, and others). Each had its own addressing system and "traditions" for access and use.
For example, if you were on a PDP-10 computer you could connect to other PDP-10 computers by typing "SET HOST..." followed by a name. This only worked for PDP-10s. It didn't work for PDP-11 computers (except sometimes - but not always - when they'd been set up to allow it) and it certainly didn't work for IBM and CDC machines.
The different networks were connected by "gateways" - computers which did some basic gluing and conversion and allowed services like email to sort-of-work, up to a point, if you were patient and knew exactly what you were doing.
Modern Internet conventions, including user@domain email addresses and the DNS naming system that maps those names onto numeric IP addresses, hadn't been widely adopted yet. The .com/.edu/.gov etc domain naming system was introduced in 1983 but wasn't widely used yet.
If you wanted to send email across systems you might need to specify addresses with colons or exclamation marks ("bangs") instead - as explained in this work-in-progress (RFC - "Request for Comments") outline from 1996: https://tldp.org/LDP/nag/node189.html
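As a small illustration of the difference (the host names are invented), here is the same recipient written as a UUCP-style "bang path", where the sender spells out every relay machine, versus a modern domain-style address, where routing and DNS work the path out for you:

```python
# Illustrative only: invented host names, contrasting a UUCP bang path
# (the sender lists every relay) with a modern domain-style address.
relays = ["bigsite", "unidept", "labvax"]   # machines the mail must pass through
user = "suzie"

bang_path = "!".join(relays + [user])       # sender must know the whole route
modern = f"{user}@labvax.example.edu"       # routing and DNS handle the path

print(bang_path)   # bigsite!unidept!labvax!suzie
print(modern)      # suzie@labvax.example.edu
```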
This was the standard level of complexity for all networked features.
And there was no web. Outside of a few research labs, networked computing had no graphics - no windows, no menus, no mouse control. Everything was done by typing commands on a keyboard and reading the response - which usually appeared one character at a time, because speeds were very limited.
In the mid-80s it was still common to use a teletype - a combined printer and keyboard that printed responses on paper - instead of a keyboard with a screen.
Both computer and network time were monitored. And they cost money. We're used to computing being either free or very cheap. In the 80s it was very, very expensive. Individual users, corporate offices and academic departments typically had a fixed amount of computer time budgeted per month. Going over the quota required a conversation with the system administrators and very probably some extra money.
So to summarise - in 1984 the Internet was a patchwork of research networks: expensive, metered, text-only, awkward to use, and largely confined to universities, research centres, and military sites.
And therefore not widely available in schools.
There was a push to connect schools once the Internet was opened to the public in the early 90s. Internet Service Providers began operating in November 1989, but the initial offerings were expensive and limited. Once the web started to take off in the mid-90s, schools had a credible reason to start offering Internet access.
There are a few caveats. A handful of schools in the US and UK had remote access to a large corporate or university computer. These large computers often included modems and allowed remote users to connect by dialling in. There are a few instances of schools being given - or sometimes buying - time on a computer.
The most famous is probably Lakeside School, which bought a teletype and a block of computer time in the late 60s and gave Bill Gates his start in computing. Gates later turned to hacking a remote PDP-10 for free computer time - initially unofficially, later with the knowledge of the corporation which owned the computer.
The other is that by the mid-80s schools in the UK were quite likely to have one or more microcomputers - cheap single-user computers that were just about powerful enough for simple programming, games, and office work. The UK ran a computer literacy project with the aid of the BBC, and many schools joined in.
The US lagged behind, but some schools had "professional" microcomputers - technically S-100 systems - used mostly for admin. This would have included letters, scheduling, and grade records. But these computers would not have been connected to a national network.
http://www.s100computers.com/Hardware%20Folder/Cromemco/History/History.htm
More generally: in the 80s wide area networked computing was expensive, difficult, and not suitable for public use. The Internet didn't exist in the form it has today, and almost no schools in the US had access to it.
But some schools did have small office computers, sometimes on an office network, and a talented user might not have found them difficult to hack into - as long as they were sitting in front of one.