IT (INFORMATION TECHNOLOGY)
Information Technology (IT). Information technology is commonly identified with computer technology, which in fact brought information technology to its modern level of development. The concept of IT covers all computing and communication technology, including consumer electronics, television, and radio broadcasting.
In a broader sense, information technology is an extensive class of disciplines and fields of activity concerned with technologies for creating, storing, managing, and processing data.
From the history of IT development
The origins of IT can be traced back to the late 19th century. The use of information technology can be said to have begun on January 21, 1888, when a partial test of Babbage's Analytical Engine was carried out and the number pi was successfully calculated. However, only half a century later, on May 12, 1941, the first programmable computer, the Z3, which possessed the properties of a modern computer, was created by the German engineer Konrad Zuse. Three years later, in 1944, the first American computer, the Mark I, was launched.
Related Fields:
- Web Development
- Java
- Databases
- Software Engineering
- Artificial Intelligence
- GW-BASIC
- C/C++
- C#
- VB.NET, etc.
See also "Computer History" Antiquitira's machinery (c.
150-100 BC) Astrolabe (1208, Persia) Objects that help with computations have
existed since ancient times, such as the abacus and a kind of analog computer.
The first mechanical calculator to be described today as a "computer"
was built by Wilhelm Sickert in 1623 [4]. Charles Babbage designed a lab that
could be programmed during the Victorian era. In 1890, the punch card system
invented by Herman Hollerith was first used in the US Census [6]. Before the
1920s, the word "computer" was used to refer to a person who did
calculations as a job. However, in this era, modern-day computational theory
and models have been devised. Pioneers in the fields that will later be called
computer science, such as Kurt Gödel, Alonzo Church, and Alan Turing, have
found computability, a (special premise knowledge)
In the 1940s, as newer and more powerful computers were developed, the word "computer" came to refer to a machine rather than a human. From the 1940s to the 1950s, electronic computers were built one after another, and by the end of the 1950s the basic concept of the modern computer (the so-called stored-program computer) was complete. As mentioned above, the punch card system proved useful for the United States Census, for numerical calculation in science and engineering (so-called numerical analysis), and for more general office work. It had long been clear that such machines were also useful for processing information in a sense broader than "calculation" narrowly understood. Around 1960 [Note 1], mainly in the academic field, the term Information Processing began to be used, and research on non-numerical applications such as machine translation and pattern recognition began. In the industrial field, terms such as Data Processing and data analysis came into use, among many others.