The Artificial Intelligence Lab at MIT had received an elegant new printer from Xerox. The printer, however, had an unfortunate tendency to jam; print jobs would pile up and nothing would get printed until someone happened to notice and clear the jam.
For Richard Stallman, one of the programmers at the AI Lab, this wasn’t such a big deal. With their previous printer, Stallman had simply changed the printer driver to detect whether the printer was jammed and, if it was, to notify anyone who had sent it a print job.
“If you got that message, you couldn’t assume somebody else would fix it,” Stallman later recalled. “You had to go to the printer. A minute or two after the printer got in trouble, the two or three people who got messages arrive to fix the machine. Of those two or three people, one of them, at least, would usually know how to fix the problem.” (Free as in Freedom, ch. 1)
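What Stallman describes amounts to a small monitoring loop: watch the printer, and when it jams, message everyone with a job waiting. The sketch below, in Python, is purely illustrative; the printer-status and job-queue interfaces (get_status, get_queue, notify) are invented stand-ins for whatever the lab's actual driver used, not a reconstruction of it.

```python
import time

def check_for_jam(printer_status: dict) -> bool:
    """Return True if the (hypothetical) status dict reports a paper jam."""
    return printer_status.get("state") == "jammed"

def notify(user: str, message: str) -> None:
    """Stand-in for sending a terminal message to a logged-in user."""
    print(f"[message to {user}] {message}")

def watch_printer(get_status, get_queue, poll_seconds: float = 60.0) -> None:
    """Poll the printer; when it jams, tell everyone with a pending job."""
    while True:
        if check_for_jam(get_status()):
            for job in get_queue():
                notify(job["owner"], "The printer is jammed -- please go fix it.")
        time.sleep(poll_seconds)
```

The point of the design is social as much as technical: by telling only the people who actually had jobs queued, the driver turned an anonymous annoyance into a problem someone had an immediate reason to fix.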
But the Xerox printer was different: Xerox hadn't provided the lab with the source code to its printer driver, so there was no way for Stallman to add the jam-notification feature. When Stallman asked Xerox for the code, the company refused, insisting the software was an important trade secret. And when Stallman found a student at Carnegie Mellon who had been given access to the code, that student also refused to provide a copy, saying he'd signed a contract with Xerox not to share it.
Stallman was outraged. Computer software was supposed to be a tool to serve people; that's why he and his labmates spent their time writing it. And yet, through a combination of greed and legal restrictions, people were forced to suffer because they were prevented from improving the very tools they depended on.