Everyday IP: When was the computer invented? - Lexology

2021-12-14 23:51:21 By : Ms. Eartha Zhu


Given that the subject of the previous article in our Everyday IP series was the car, you might think we could not have chosen an invention more important to people's daily lives... but the computer may fit that description. After all, at this very moment you are using a computer to read this article. Whether it is your home laptop, your work laptop, or the smartphone in your hand, its processing power likely exceeds that of a desktop computer from only a few years ago.

Computers (and computing) are not only vital to the personal and professional lives of many people; they also play a key role in the creation, dissemination, and protection of intellectual property. With this in mind, let us briefly review the history of these machines while examining how they have shaped the IP industry.

The first computers: from ancient Greece and the abacus to Charles Babbage

Ancient Greece can be credited with creating the first device that can reasonably be considered a computer: a machine known as the Antikythera mechanism. This hand-operated device may date back to around 200 BC. It consists of a small wooden box (approximately 34 x 18 x 9 cm) containing as many as 30 gears, and was presumably used for early astronomical calculations.

More practical, however, was the abacus. Many ancient civilizations created these counting machines, including Sumer, Babylon, Persia, Greece, Rome, and China. In its earliest Mesopotamian iteration, the abacus was a board whose surface was covered with sand in which figures could be traced. Later devices used thin metal rods arranged in parallel within a simple rectangular frame, with beads, marbles, or similar objects serving as counters. These could be moved back and forth to perform all four arithmetic operations (addition, subtraction, multiplication, and division) as well as basic square-root calculations.

Today, you can find an abacus in almost any early elementary school classroom, and the device remains popular in Middle Eastern countries, China, and Japan.

Jumping ahead more than 1,500 years, we come to the origin of the term "computer" itself. When it was first coined in 1613, the word described a person with the mathematical skill to perform complex calculations. About 188 years later, in 1801, the textile merchant Joseph Marie Jacquard designed and built a loom that used wooden punch cards to direct certain functions of the machine, foreshadowing an element of computing that would appear more than a century later.

An automatic computer capable of performing complex calculations: the Analytical Engine, designed by Charles Babbage as an improvement on his original Difference Engine. (Image source: Mrjohncummings/Wikimedia Commons/CC BY-SA 2.0)

A greater development was the Difference Engine of the British mathematician Charles Babbage. In 1822, this computing pioneer conceived a steam-powered calculating machine that could work through equations involving large tables of numbers. He never managed to complete a fully functional model, but improved on the design in 1834 with his Analytical Engine. Although only part of it was built before his death, the programmable machine used punch cards for arithmetic calculations and supported basic conditional branching, looping, microprogramming, parallel processing, and other operations common in modern computers.

Reinventing the wheel: the early supercomputers

Although never realized in his lifetime, Babbage's vision heralded the room-sized computers of the 1940s, such as the Harvard Mark I and the Electronic Numerical Integrator and Computer (ENIAC). Unfortunately, between his death in 1871 and the 1940s, most of the British pioneer's work faded into obscurity.

For example, John Mauchly, co-creator of ENIAC, did not know that Babbage had used punch cards, nor did he realize how much his own invention owed to the Briton in other ways. As far as we know, Babbage never tried to patent either the Difference Engine or the Analytical Engine, which likely did little to keep his name alive or his innovations at the forefront of research. In the end, the creators of ENIAC and the Mark I essentially retraced Babbage's steps in various ways without realizing it.

But this era had many watershed firsts of its own, most notably Alan Turing's concept of a "universal computing machine." In a 1936 paper, Turing posited that a computer programmed with instructions stored in the machine (at the time, as data on tape) could compute almost any mathematical or scientific equation. His research would become integral to the development of the personal computer and to the earliest theories of artificial intelligence. Sadly, Turing was never able to make full use of his genius: not because he failed to protect his intellectual property, but because of society's prejudice against his sexual orientation. This persecution led to a shocking arrest and conviction in 1952, and drove him to suicide two years later.
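To make Turing's idea concrete, here is a minimal sketch of a Turing machine simulator in Python. The machine itself is nothing but a table of stored instructions, which is the essence of the stored-program concept Turing described; the function names and the example binary-increment program below are illustrative, not taken from Turing's paper.

```python
# Minimal Turing machine sketch: a machine is just a transition table
# mapping (state, symbol) -> (symbol_to_write, head_move, next_state).

def run(transitions, tape, state="start", blank="_", max_steps=1000):
    """Run the machine until it reaches the 'halt' state; return the tape."""
    tape = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        write, move, state = transitions[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape)).strip(blank)

# Example program: increment a binary number.
# Scan right to the end of the input, then carry leftward.
INCREMENT = {
    ("start", "0"): ("0", "R", "start"),
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),   # 1 + carry -> 0, keep carrying
    ("carry", "0"): ("1", "L", "done"),    # 0 + carry -> 1, stop carrying
    ("carry", "_"): ("1", "L", "done"),    # overflow: prepend a 1
    ("done",  "0"): ("0", "L", "done"),
    ("done",  "1"): ("1", "L", "done"),
    ("done",  "_"): ("_", "R", "halt"),
}

print(run(INCREMENT, "1011"))  # binary 11 + 1 -> "1100" (binary 12)
```

The point of the sketch is that `run` never changes: swapping in a different transition table yields a different "machine," exactly the programmability that distinguishes Turing's universal machine from single-purpose devices like the Difference Engine.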

Intellectual Property Wars in Silicon Valley (and beyond)

In contrast to the hardship and obscurity of their predecessors, some of the most prominent figures behind modern computing milestones are household names today: think of Microsoft's Bill Gates and Apple's Steve Jobs, and, to a lesser extent, Apple co-founder Steve Wozniak. Other less famous but notable achievements abound: Alan Shugart pioneered the floppy disk in 1971, Bjarne Stroustrup invented the C++ programming language in 1980, and Tim Berners-Lee created HTML in 1990 (along with the world's first web browser).

The Macintosh 128K, released by Apple Computer, Inc., said "Hello" to the world on January 24, 1984. Its OS ROM contained an Easter egg designed to deter unauthorized Macintosh clones.

New developments in computing continue to emerge, and since the 1980s they have arrived at an increasingly frantic pace. The pressure to develop the next big thing (and, arguably, the industry's lack of sentimentality) sometimes drives everyone from software developers and hardware engineers to product designers to "borrow" the work of others. This has led to countless, often lengthy legal battles.

Technology IP cases have reached the Supreme Court of the United States. A recent example is Google v. Oracle, in which Oracle accused Google of copying application programming interfaces (APIs) from its Java code. (Because Google petitioned the Supreme Court to hear the lawsuit, the case is no longer styled Oracle v. Google.) Since the Java APIs at issue were openly available, Google always had a reasonable chance of winning, and the case ended in April 2021 with a 6-2 Supreme Court ruling in Google's favor, roughly 10 years after the dispute began. In a similarly protracted case, Apple and Samsung reached a settlement in 2018 largely in Apple's favor, seven years after Apple first claimed that Samsung had "blindly" copied the iPhone design for its own smartphones.

Taking a broader view of these issues: under Gates's leadership, Microsoft aggressively pursued anyone suspected of infringing its intellectual property, yet the company also has a tortuous history of monopolistic and anti-competitive practices, including allegations that MS-DOS borrowed from the earlier CP/M operating system. Many other computing giants have faced similar accusations.

With intangible intellectual property making up most of the assets of many organizations, technology designs and patents now characterize much of the IP landscape. For the foreseeable future, technology-related disputes seem likely to account for the largest share of intellectual property litigation.


© Copyright 2006-2021 Legal Business Research