
Wednesday, January 25, 2023

History of artificial intelligence


1936

Alan Turing

In 1936, Alan Turing formally defined a universal machine, demonstrating the feasibility of a physical device that could carry out any formally defined computation.

384 BCE

Aristotle

The most basic ideas date back to the Greeks, before Christ. Aristotle (384-322 BC) was the first to describe a set of rules capturing part of the workings of the mind for drawing rational conclusions.

 

250 BCE

First self-controlled machine

Ctesibius of Alexandria (c. 250 BC) built the first self-controlled machine, a regulator of water flow (rational, but without reasoning).

1315

Ramon Llull

In 1315, Ramon Llull, in his book Ars magna, proposed the idea that reasoning could be carried out artificially.

 

1637

Influential

In 1637, one of the most influential philosophers of the seventeenth century predicted the possibility of creating machines that could think for themselves. That figure was René Descartes.

 

 


1847

Logical reasoning

After years of pause, in 1847 the mathematician George Boole added another piece to this story, establishing that logical reasoning could be systematized in the same way that a mathematical equation is solved.
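Boole's idea, that a logical law can be verified as mechanically as an equation is solved, can be illustrated with a short sketch. This is only an illustration; the helper name `holds_for_all` is mine, not from any historical source.

```python
from itertools import product

# Boole's insight, sketched in code: a logical law can be checked
# mechanically, like solving an equation, by evaluating both sides
# over every possible truth assignment of its variables.
def holds_for_all(law):
    """Check a three-variable logical identity over all 8 assignments."""
    return all(law(p, q, r) for p, q, r in product([False, True], repeat=3))

# Distributive law: p and (q or r)  ==  (p and q) or (p and r)
assert holds_for_all(lambda p, q, r: (p and (q or r)) == ((p and q) or (p and r)))

# De Morgan's law: not (p or q)  ==  (not p) and (not q)
assert holds_for_all(lambda p, q, r: (not (p or q)) == ((not p) and (not q)))
```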

 

1877

First-Order Logic

Thirty years later, in 1877, Gottlob Frege built on Boole's work to obtain first-order logic, which had greater expressive power; it is still used as a reference today.

 

 


 

1943

Warren McCulloch and Walter Pitts

In 1943, Warren McCulloch and Walter Pitts presented their model of artificial neurons, which is considered the first work in the field, although the term "artificial intelligence" did not yet exist.
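The McCulloch-Pitts model can be sketched in a few lines. This is a minimal illustration assuming binary inputs, fixed weights, and a hard threshold; the function name is mine, not historical.

```python
# A minimal sketch of a McCulloch-Pitts neuron (1943 model): the neuron
# "fires" (outputs 1) when the weighted sum of its binary inputs
# reaches a fixed threshold, and stays silent (outputs 0) otherwise.
def mcculloch_pitts(inputs, weights, threshold):
    """Return 1 if the weighted sum of binary inputs meets the threshold."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# With unit weights and threshold 2, two inputs implement logical AND:
assert mcculloch_pitts([1, 1], [1, 1], 2) == 1
assert mcculloch_pitts([1, 0], [1, 1], 2) == 0

# Lowering the threshold to 1 turns the same neuron into logical OR:
assert mcculloch_pitts([0, 1], [1, 1], 1) == 1
assert mcculloch_pitts([0, 0], [1, 1], 1) == 0
```

Choosing the threshold relative to the weights is what selects the logical function the neuron computes, which is why McCulloch and Pitts could argue that networks of such units can realize logical expressions.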

 

 


1950

First Advances

In 1950, the first important advances began with the work of Alan Turing; since then, the field has passed through alternating periods of progress and stagnation.

 




 

1955

Herbert Simon, Allen Newell and J.C. Shaw

In 1955, Herbert Simon, Allen Newell, and J. C. Shaw developed IPL-11, the first programming language oriented to problem solving.

 

 


1956

Herbert Simon, Allen Newell, and J. C. Shaw

A year later, in 1956, Herbert Simon, Allen Newell, and J. C. Shaw developed the Logic Theorist, which was able to prove mathematical theorems.

 

 


1956

John McCarthy, Marvin Minsky and Claude Shannon

In 1956, the term "artificial intelligence" was coined by John McCarthy, Marvin Minsky, and Claude Shannon at the Dartmouth Conference, a congress at which triumphalist ten-year forecasts were made that were never fulfilled, leading to the near-total abandonment of research for fifteen years.

 


2006

50 Years of Artificial Intelligence

In 2006, the fiftieth anniversary of the field was celebrated in Spain with the congress "50 Years of Artificial Intelligence - Multidisciplinary Campus in Perception and Intelligence 2006."

 

 




2011

2009-2011

• In 2009, intelligent therapeutic systems were already under development that could detect emotions in order to interact with autistic children.

• In 2011, IBM's supercomputer Watson won a three-game Jeopardy! match against the show's two top champions, earning a $1 million prize that IBM then donated to charity.

 

 


2016

2016

• In 2016, a computer program beat the three-time European Go champion five games to zero.

• In 2016, some people who unknowingly converse with a chatbot do not realize they are talking to a program, so the Turing test is fulfilled as originally formulated: "Artificial intelligence will exist when we are not able to distinguish between a human being and a computer program in a blind conversation."

Viruses

download

Quantum Computing

download the document here

History of computers

1937:

George Stibitz of Bell Telephone Laboratories builds the "Model K" Adder, so called because he built it on his kitchen table. This simple demonstration circuit provides proof of concept for applying Boolean logic to the design of computers, resulting in construction of the relay-based Model I Complex Calculator in 1939. That same year in Germany, engineer Konrad Zuse built his Z2 computer, also using telephone company relays.
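The Boolean-logic principle behind relay adders like the Model K can be sketched as a one-bit half adder built purely from logic operations. The function names here are illustrative, not from any historical source.

```python
# A one-bit half adder from pure Boolean logic, the building block of
# relay adders: XOR of the inputs gives the sum bit, AND gives the carry.
def half_adder(a, b):
    """Add two bits; return (sum_bit, carry_bit)."""
    return a ^ b, a & b

# Chaining two half adders (plus an OR for the carries) yields a full
# adder, which can be cascaded to add numbers of any width.
def full_adder(a, b, carry_in):
    """Add two bits plus an incoming carry; return (sum_bit, carry_out)."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2

assert half_adder(1, 1) == (0, 1)     # 1 + 1 = binary 10
assert full_adder(1, 1, 1) == (1, 1)  # 1 + 1 + 1 = binary 11
```

In a relay machine the same gates are realized electromechanically: a relay contact either passes or blocks current, playing the role of the AND, OR, and XOR operations above.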

 

                             
                               


1939:

David Packard and Bill Hewlett found their company in a Palo Alto, California garage. Their first product, the HP 200A Audio Oscillator, rapidly became a popular piece of test equipment for engineers. Walt Disney Pictures ordered eight of the 200B model to test recording equipment and speaker systems for the 12 specially equipped theatres that showed the movie "Fantasia" in 1940.

 

1940:

In 1939, Bell Telephone Laboratories completes the Complex Number Calculator (CNC), designed by scientist George Stibitz. In 1940, Stibitz demonstrated the CNC at an American Mathematical Society conference held at Dartmouth College. Stibitz stunned the group by performing calculations remotely on the CNC (located in New York City) using a Teletype terminal connected over special telephone lines. This is likely the first example of remote-access computing.



 

 


1941:

Built as an electro-mechanical means of decrypting Nazi ENIGMA-based military communications during World War II, the British Bombe is conceived of by computer pioneer Alan Turing and Harold Keen of the British Tabulating Machine Company. Hundreds of allied bombes were built in order to determine the daily rotor start positions of Enigma cipher machines, which in turn allowed the Allies to decrypt German messages. The basic idea for bombes came from Polish code-breaker Marian Rejewski's 1938 "Bomba."

 

 


1944:

Designed by British engineer Tommy Flowers, the Colossus is designed to break the complex Lorenz ciphers used by the Nazis during World War II. A total of ten Colossi were delivered, each using as many as 2,500 vacuum tubes. A series of pulleys transported continuous rolls of punched paper tape containing possible solutions to a particular code. Colossus reduced the time to break Lorenz messages from weeks to hours. Most historians believe that the use of Colossus machines significantly shortened the war by providing evidence of enemy intentions and beliefs. The machine’s existence was not made public until the 1970s.

 

1948:


University of Manchester researchers Frederic Williams, Tom Kilburn, and Geoff Toothill develop the Small-Scale Experimental Machine (SSEM), better known as the Manchester "Baby." The Baby was built to test a new memory technology developed by Williams and Kilburn, soon known as the Williams Tube, which was the first high-speed electronic random access memory for computers. Their first program, consisting of seventeen instructions and written by Kilburn, ran on June 21st, 1948. This was the first program in history to run on a digital, electronic, stored-program computer.

 

 

 

1949:


While many early digital computers were based on similar designs, such as the IAS and its copies, others were unique designs, like the CSIRAC. Built in Sydney, Australia, by the Council for Scientific and Industrial Research for use in its Radiophysics Laboratory, CSIRAC was designed by British-born Trevor Pearcey and used unusual 12-hole paper tape. It was transferred to the Department of Physics at the University of Melbourne in 1955 and remained in service until 1964.

 

1955:

A commercial version of Alan Turing's Pilot ACE, called DEUCE (the Digital Electronic Universal Computing Engine), is used mostly for science and engineering problems and a few commercial applications. Over 30 were completed, including one delivered to Australia.

 



1977:

Sold complete with a main logic board, switching power supply, keyboard, case, manual, game paddles, and a cassette tape containing the game Breakout, the Apple II finds popularity far beyond the hobbyist community that had made up Apple's user base until then. When connected to a color television set, the Apple II produced brilliant color graphics for the time. Millions of Apple IIs were sold between 1977 and 1993, making it one of the longest-lived lines of personal computers. Apple gave away thousands of Apple IIs to schools, giving a new generation their first access to personal computers.

 

1982:


The C64, as it is better known, sells for $595, comes with 64 KB of RAM, and features impressive graphics. Thousands of software titles were released over the lifespan of the C64, and by the time it was discontinued in 1993 it had sold more than 22 million units. It is recognized by the 2006 Guinness Book of World Records as the best-selling single computer model of all time.

 

1991:

Apple's Macintosh Portable meets with little success in the marketplace and leads to a complete redesign of Apple's line of portable computers. All three PowerBooks introduced featured a built-in trackball, internal floppy drive, and palm rests, which would eventually become typical of 1990s laptop design. The PowerBook 100 was the entry-level machine, while the PowerBook 140 was more powerful and had a larger memory. The PowerBook 170 was the high-end model, featuring an active matrix display, faster processor, as well as a floating point unit. The PowerBook line of computers was discontinued in 2006. 


2008:


Apple introduces its first ultraportable notebook, the MacBook Air: a light, thin laptop with a high-capacity battery. The Air incorporated many of the technologies that had been associated with Apple's MacBook line of laptops, including an integrated camera and Wi-Fi. To reduce its size, the traditional hard drive could be replaced with a solid-state disk, making the Air one of the first mass-market computers to ship with SSD storage.

 

 

2010:

The iPad combines many of the popular capabilities of the iPhone, such as a built-in high-definition camera, access to the iTunes Store, and audio-video capabilities, but with a nine-inch screen and without the phone. Apps, games, and accessories helped spur the iPad's popularity and led to its adoption in thousands of different applications: movie making, creating art, making music, inventory control, and point-of-sale systems, to name but a few.

 


History of Microsoft Windows

 Windows 1.0 – 2.0 (1985-1992)

Instead of typing MS-DOS commands, Windows 1.0 allowed users to point and click their way through on-screen windows.

In 1987 Microsoft released Windows 2.0, which was designed for the Intel 286 processor. This version added desktop icons, keyboard shortcuts, and improved graphics support.

 

Windows 3.0 – 3.1 (1990–1994)

Microsoft released Windows 3.0 in May 1990, offering better icons, better performance, and advanced graphics with 16 colors, designed for Intel 386 processors. Its popularity grew manifold following the release of the SDK, which helped software developers focus more on writing applications and less on writing device drivers. With Windows 3.0, Microsoft completely rewrote the application development environment. The OS included Program Manager.

 


Windows 95 (August 1995)

Windows 95 was a major release of the Microsoft Windows operating system that caused Apple's market share to decline. Windows 95, as the name suggests, was released in 1995 and represented a significant advance over its precursor, Windows 3.1. This was also when the first version of Microsoft's proprietary browser, Internet Explorer 1, was rolled out in August 1995 to catch the Internet wave.


Windows 98 (June 1998)

Described as an operating system that "Works Better & Plays Better," Windows 98 offered support for a number of new technologies, including FAT32, AGP, MMX, USB, DVD, and ACPI. It was also the first OS to include a tool called Windows Update, which alerted customers when software updates became available for their computers.

 


Windows ME – Millennium Edition (September 2000)

The Windows Millennium Edition, referred to as "Windows Me," was an update to the Windows 98 core that included some features of the Windows 2000 operating system. This version removed the "boot in DOS" option but included other enhancements, such as Windows Media Player and Movie Maker for basic video editing.

 

Windows 2000 (February 2000) 

W2K (its abbreviated form) was an operating system for business desktop and laptop systems, used to run software applications, connect to Internet and intranet sites, and access files, printers, and network resources. Microsoft released four versions of Windows 2000:

1- Professional (for business desktop and laptop systems)

2- Server (both a Web server and an office server)

3- Advanced Server (for line-of-business applications)

4- Datacenter Server (for high-traffic computer networks) 


 


Windows XP (October 2001)

This version of the OS was built on the Windows 2000 kernel and was introduced in 2001 with a redesigned look and feel. It was made available to the public in two versions:

1- Windows XP Home

2- Windows XP Professional

Microsoft focused on mobility for both editions; plug-and-play features for connecting to wireless networks were introduced in this version of Windows, and it proved to be one of Microsoft's best-selling products. Its use began declining as Windows 7 deployments grew.

 

Windows Vista (November 2006)

A marketing flop: people expected too much from its "wow" factor. Windows Vista, released in November 2006, was widely criticized for performance-related issues.


Windows 7 (October 2009)

Windows 7 made its official debut on October 22, 2009. The OS included enhancements in the form of fast start-up time, Aero Snap, Aero Shake, support for virtual hard disks, a new and improved Windows Media Center, and better security features. 

 

 




Windows 8 (October 2012)

Bill Gates' vision of future computing was touch and voice replacing mouse and keyboard. Windows 8 delivered the touch: a completely redesigned OS built from the ground up.


Windows 8.1 (October 2013)

Windows 8.1 changed a few things for the better which were found wanting in Windows 8.

Notable changes included a visible Start button, an improved Start screen, Internet Explorer 11, tighter OneDrive integration, a Bing-powered unified search box, and the ability to land on the desktop at login instead of the Start screen.

 





Windows 11 (October 2021)

Windows 11, released in 2021, has all the features, power, and security of Windows 10. The most visible differences are a redesigned desktop and Settings app, but there are also several new features under the hood.