During the first half of the 20th century, many scientific computing needs (computing being, broadly, any goal-oriented activity requiring, benefiting from, or creating computers) were met by increasingly sophisticated analog computers, a form of computer that uses continuously variable physical phenomena such as electrical, mechanical, or hydraulic quantities to model the problem being solved. These machines used a direct mechanical or electrical model of the problem as the basis for computation. However, they were not programmable and generally lacked the versatility and accuracy of modern digital computers.
Alan Turing, a British mathematician, logician, cryptanalyst, and computer scientist, is widely regarded as the father of computer science ("Comp Sci," or "CS"), the scientific and practical approach to computation and its applications. In 1936, Turing provided an influential formalization of the concepts of computation and the algorithm (in mathematics and computer science, a step-by-step procedure for calculations) with the Turing machine, a device that manipulates symbols on a strip of tape according to a table of rules, providing a blueprint for the electronic digital computer.
Of his role in the creation of the modern computer, Time magazine, an American weekly news magazine published in New York City, United States (US), in naming Turing one of the 100 most influential people of the 20th century ("Time 100: The Most Important People of the Century," a compilation published in Time in 1999), stated: "The fact remains that everyone who taps at a keyboard, opening a spreadsheet or a word-processing program, is working on an incarnation of a Turing machine."