AI and its relation to the concepts of information technology

I chose AI as my tech topic.

AI and its relation to the concepts of information technology and computer science, including the history of computers:

Artificial Intelligence can be traced back to the earliest computers, when pioneers like Alan Turing posed questions about machine intelligence (Turing, 1950). (Side note: did you know Alan Turing was arrested and chemically castrated for being a homosexual? That happened after his service helping crack the Enigma code the Germans used to disguise their communications in World War II, work that ultimately saved hundreds of thousands of lives.)

The advent of digital computers laid the foundation for AI, creating the possibility for a machine to simulate human reasoning. This is deeply connected to the central tenets of computer science, which involves problem-solving, algorithmic thinking, and the efficient use of computational resources (Brookshear & Brylow, 2020).

Reliance on the major hardware components and functions of a modern computer system:

AI and machine learning models require significant computational power for their operation. These models rely on central processing units (CPUs) and, even more heavily, on graphics processing units (GPUs) for their number-crunching abilities. They also demand substantial memory (RAM) for storing intermediate data during computations (Goodfellow et al., 2016). Furthermore, specialized hardware, like Tensor Processing Units (TPUs), has been developed to better support AI (Jouppi et al., 2017).
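To get a feel for why models demand so much RAM, here is a minimal back-of-the-envelope sketch in Python. The layer sizes are invented for illustration (an MNIST-style classifier), and the estimate covers parameters only, not the intermediate activations or gradients that training also keeps in memory:

```python
# Rough memory estimate for a stack of fully connected layers.
# Layer sizes below are hypothetical; float32 = 4 bytes per parameter.

def param_count(layer_sizes):
    """Weights (n_in * n_out) plus biases (n_out) per layer pair."""
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))

layers = [784, 512, 256, 10]          # e.g. a small image classifier
params = param_count(layers)
ram_mb = params * 4 / (1024 ** 2)     # bytes -> mebibytes
print(f"{params:,} parameters, about {ram_mb:.1f} MB of RAM")
```

Even this tiny network needs a couple of megabytes just for its weights; modern large models scale the same arithmetic into the billions of parameters, which is why GPUs and TPUs with large dedicated memory matter so much.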

Usage of programming languages and program execution methods:

AI algorithms are implemented using various programming languages such as Python and Java. Python, with its extensive range of libraries like TensorFlow, PyTorch, and Scikit-Learn, is favored in the AI community (Russell & Norvig, 2020). The programming logic of AI involves multiple stages, including data pre-processing, model training, validation, testing, and deployment, all of which fall under program execution methods.
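Those stages can be sketched end to end in plain Python. This is a toy illustration, not a real library workflow: the data is a made-up linear relationship, the "model" is one weight and one bias fit by gradient descent, and the names (`normalize`, `train`, `mse`) are invented for this example:

```python
# Toy walk-through of pre-processing, training, and validation
# for a 1-D linear model y = w*x + b, fit by gradient descent.

def normalize(xs):
    """Pre-processing: scale features into the range [0, 1]."""
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) for x in xs]

def train(xs, ys, lr=0.5, epochs=1000):
    """Training: gradient descent on mean squared error."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

def mse(w, b, xs, ys):
    """Validation/testing: mean squared error on held-out data."""
    return sum((w * x + b - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

xs = list(range(10))
ys = [2 * x + 1 for x in xs]        # synthetic ground truth
xn = normalize(xs)                  # pre-process
train_x, test_x = xn[:8], xn[8:]    # split into train / held-out sets
train_y, test_y = ys[:8], ys[8:]
w, b = train(train_x, train_y)      # train
print("held-out MSE:", mse(w, b, test_x, test_y))
```

Real frameworks like Scikit-Learn wrap these same stages behind `fit`/`predict`-style interfaces, but the underlying loop of pre-process, fit, then evaluate on data the model has not seen is the same.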

Role of application software in regards to AI:

Application software plays a crucial role in AI by providing an interface where AI can be used to solve real-world problems. For instance, recommendation systems in streaming services, voice assistants in smartphones such as Alexa, Siri, and Google Assistant, and diagnostic tools in healthcare all utilize AI via application software to enhance user experiences (Russell & Norvig, 2020).
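A recommendation system is a good concrete case. Below is a deliberately tiny sketch of the collaborative idea behind streaming-service recommendations: find the most similar user and suggest what they liked. The user names, titles, and similarity measure (count of shared liked titles) are all invented for illustration; real systems use far richer models:

```python
# Hypothetical toy recommender: suggest titles liked by the user
# whose tastes overlap most with ours.

likes = {
    "ana":  {"Dune", "Arrival", "Interstellar"},
    "ben":  {"Dune", "Arrival", "Blade Runner"},
    "cara": {"Toy Story", "Up"},
}

def recommend(user):
    # Similarity = number of titles both users liked.
    others = [(len(likes[user] & likes[o]), o)
              for o in likes if o != user]
    _, best = max(others)             # most similar other user
    new = likes[best] - likes[user]   # titles our user hasn't seen
    return sorted(new)

print(recommend("ana"))   # -> ['Blade Runner']
```

The application software (the streaming app's interface) is what puts a result like this in front of the user; the AI logic itself stays behind the scenes.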

Relation to the basic concepts of database and database management:

AI heavily relies on data. Databases and database management systems (DBMSs) are used to efficiently store, retrieve, and manage the vast amounts of data required for training AI models. Techniques like data mining and data warehousing are commonly employed in AI to extract valuable insights from databases (Date, 2018).
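As a small illustration of that store-and-retrieve loop, here is a sketch using Python's built-in `sqlite3` module. The table and column names are made up for the example; the point is that a model can query only the rows it needs for a given training batch instead of loading everything into memory:

```python
import sqlite3

# Sketch: keep labeled training examples in a relational database,
# then pull back a filtered subset for training.

conn = sqlite3.connect(":memory:")   # throwaway in-memory database
conn.execute("""CREATE TABLE samples (
                    id INTEGER PRIMARY KEY,
                    feature REAL,
                    label TEXT)""")
conn.executemany(
    "INSERT INTO samples (feature, label) VALUES (?, ?)",
    [(0.2, "cat"), (0.9, "dog"), (0.4, "cat")])
conn.commit()

# Retrieve only the rows needed for one class.
rows = conn.execute(
    "SELECT feature, label FROM samples WHERE label = ?", ("cat",)
).fetchall()
print(rows)   # -> [(0.2, 'cat'), (0.4, 'cat')]
```

The same pattern scales up: production pipelines swap the in-memory database for a managed DBMS, but the query-then-train flow is unchanged.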

Influence of the basic concepts of network architecture, management, and security on AI:

AI is increasingly being deployed in networked environments, from cloud-based AI services to edge AI on Internet of Things (IoT) devices. Network architecture and management therefore play a vital role in the deployment and functioning of AI systems (Sharma et al., 2017). AI itself is also being used to enhance network security by detecting and reacting to anomalies and potential threats (Buczak & Guven, 2016). (Side note: AI has been used to detect other AI-generated work. For instance, models like GPT-3 and GPT-4 generate text by predicting the next word in a sequence, given the previous words. While these models can generate impressively coherent and realistic text, there may be certain giveaways that suggest the text was not written by a human, such as repetitive structures, odd phrasings, or an overuse of certain words or phrases, since AI models don't have the same natural sense of language variety and creativity that humans do.)
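The anomaly-detection idea behind that kind of network monitoring can be shown with a very simple statistical sketch. The traffic numbers and the three-standard-deviations threshold below are invented for illustration; real intrusion-detection systems use far more sophisticated machine-learning models (as surveyed in Buczak & Guven, 2016):

```python
from statistics import mean, stdev

# Toy network monitor: flag any traffic reading that sits far
# outside the historical baseline. Numbers are hypothetical.

baseline = [100, 102, 98, 101, 99, 103, 97, 100]   # packets/sec
mu, sigma = mean(baseline), stdev(baseline)

def is_anomalous(reading, threshold=3.0):
    """Flag readings more than `threshold` std devs from the mean."""
    return abs(reading - mu) / sigma > threshold

print(is_anomalous(101))   # ordinary traffic
print(is_anomalous(480))   # a spike worth investigating
```

The design choice here is the essence of anomaly detection: learn what "normal" looks like from past data, then react to anything that deviates from it, rather than trying to enumerate every possible attack in advance.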

References:

Brookshear, J. G., & Brylow, D. (2020). Computer Science: An Overview. Pearson.

Buczak, A. L., & Guven, E. (2016). A survey of data mining and machine learning methods for cybersecurity intrusion detection. IEEE Communications Surveys & Tutorials, 18(2), 1153-1176.

Date, C. J. (2018). Database system concepts. Pearson Education India.

Goodfellow, I., Bengio, Y., & Courville, A. (2016). Deep learning. MIT press.

Jouppi, N. P., Young, C., Patil, N., Patterson, D., et al. (2017). In-datacenter performance analysis of a tensor processing unit. Proceedings of the 44th Annual International Symposium on Computer Architecture.
