International Baccalaureate IB Computer Science

A.4.3.1 Linear Regression Overview
Explain how linear regression is used to predict continuous outcomes.
- The relationship between the independent (predictor) and dependent (response) variables
- The significance of the slope and intercept in the regression equation
- How well the model fits the data, often assessed using measures such as the coefficient of determination (R²).
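The points above can be illustrated with a minimal least-squares sketch (the hours-studied/exam-score data below is hypothetical, not from the syllabus), computing the slope, intercept and R² by hand:

```python
# Simple linear regression via least squares, with R² as the fit measure.
def linear_regression(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope = covariance(x, y) / variance(x); intercept places the line
    # through the point of means.
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    # R² = 1 - (residual sum of squares / total sum of squares)
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - mean_y) ** 2 for y in ys)
    r2 = 1 - ss_res / ss_tot
    return slope, intercept, r2

# Hypothetical data: hours studied (independent) vs exam score (dependent)
hours = [1, 2, 3, 4, 5]
scores = [52, 55, 61, 64, 68]
slope, intercept, r2 = linear_regression(hours, scores)
# slope ≈ 4.1 (score gained per extra hour), intercept ≈ 47.7, R² ≈ 0.99
```

The slope gives the predicted change in the response per unit change in the predictor, and R² close to 1 indicates the line explains most of the variation.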
A.4.3.2 Supervised Classification
Explain how classification techniques in supervised learning are used to predict discrete categorical outcomes.
- K-Nearest Neighbours (K-NN) and decision tree algorithms to categorize new data points based on patterns learned from existing labelled data
- Real-world applications of K-NN may include collaborative filtering recommendation systems.
- Real-world applications of decision trees may include medical diagnosis based on a patient’s symptoms.
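A minimal K-NN sketch (the 2-D points and labels below are hypothetical): a new point is assigned the majority label among its k nearest labelled neighbours.

```python
# K-Nearest Neighbours: classify a point by majority vote among the
# k training points closest to it (Euclidean distance).
from collections import Counter
import math

def knn_classify(training, point, k=3):
    # training: list of ((x, y), label) pairs
    by_distance = sorted(training, key=lambda item: math.dist(item[0], point))
    votes = Counter(label for _, label in by_distance[:k])
    return votes.most_common(1)[0][0]

# Hypothetical labelled data: cluster "A" near the origin, "B" near (5, 5)
data = [((0, 0), "A"), ((1, 0), "A"), ((0, 1), "A"),
        ((5, 5), "B"), ((6, 5), "B"), ((5, 6), "B")]
knn_classify(data, (0.5, 0.5))  # → "A"
knn_classify(data, (5.5, 5.5))  # → "B"
```

Note that K-NN stores the labelled data and defers all work to prediction time, whereas a decision tree learns explicit if/else splits in advance.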
A.4.3.3 Hyperparameter Tuning & Evaluation
Explain the role of hyperparameter tuning when evaluating supervised learning algorithms.
- Accuracy, precision, recall and F1 score as evaluation metrics
- The effect of hyperparameter tuning on model performance
- Overfitting and underfitting when training algorithms
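The four evaluation metrics above can be computed from a confusion matrix; a short sketch on hypothetical binary predictions (1 = positive class, 0 = negative class):

```python
# Accuracy, precision, recall and F1 score from true/false
# positives and negatives.
def evaluate(actual, predicted):
    tp = sum(a == 1 and p == 1 for a, p in zip(actual, predicted))
    fp = sum(a == 0 and p == 1 for a, p in zip(actual, predicted))
    fn = sum(a == 1 and p == 0 for a, p in zip(actual, predicted))
    tn = sum(a == 0 and p == 0 for a, p in zip(actual, predicted))
    accuracy = (tp + tn) / len(actual)
    precision = tp / (tp + fp)   # of predicted positives, how many were right
    recall = tp / (tp + fn)      # of actual positives, how many were found
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean
    return accuracy, precision, recall, f1

# Hypothetical test-set labels and model predictions
actual    = [1, 1, 1, 1, 0, 0, 0, 0]
predicted = [1, 1, 1, 0, 1, 0, 0, 0]
evaluate(actual, predicted)  # → (0.75, 0.75, 0.75, 0.75)
```

An overfitted model scores highly on the training set but poorly on such held-out test data; an underfitted model scores poorly on both.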
A.4.3.7 Genetic Algorithms in Practice
Describe the application of genetic algorithms in various real-world situations.
- Key components, for example: population, fitness function, selection, crossover, mutation, evaluation, termination
- Real-world application: optimization problems such as route planning (travelling salesperson problem).
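A toy genetic algorithm showing each listed component on a hypothetical "OneMax" problem (evolve a bit-string of all 1s); the problem and parameter choices are illustrative, not from the syllabus:

```python
# Toy genetic algorithm: population, fitness function, tournament
# selection, one-point crossover, bit-flip mutation, evaluation,
# and termination once a perfect individual appears.
import random

def genetic_algorithm(length=12, pop_size=20, generations=60, seed=1):
    rng = random.Random(seed)
    fitness = sum  # fitness function: number of 1s in the bit-string
    # Population: random bit-strings
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        # Termination: stop early if a perfect individual exists
        if any(fitness(ind) == length for ind in pop):
            break
        next_pop = []
        for _ in range(pop_size):
            # Selection: fittest of a random tournament of 3
            p1 = max(rng.sample(pop, 3), key=fitness)
            p2 = max(rng.sample(pop, 3), key=fitness)
            # Crossover: splice the parents at a random point
            cut = rng.randrange(1, length)
            child = p1[:cut] + p2[cut:]
            # Mutation: occasionally flip one random bit
            if rng.random() < 0.1:
                i = rng.randrange(length)
                child[i] = 1 - child[i]
            next_pop.append(child)
        pop = next_pop  # evaluation happens via fitness() each generation
    return max(pop, key=fitness)

best = genetic_algorithm()
```

For route planning, the same skeleton applies with a permutation of cities as the individual and total route distance (minimised) as the fitness function.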
A.4.3.8 ANN Structure & MLPs
Outline the structure and function of ANNs and how multi-layer networks are used to model complex patterns in data sets.
- An artificial neural network (ANN) uses interconnected nodes, or “neurons”, to process and learn from input data, enabling tasks such as classification, regression and pattern recognition
- Sketch of a single perceptron, highlighting its input, weights, bias, activation function and output
- Sketch of a multi-layer perceptron (MLP) encompassing the input layer, one or more hidden layers and the output layer.
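Alongside the sketches, the forward pass of a single perceptron can be written in a few lines; here the weights and bias are chosen (hypothetically) so it computes logical AND of two inputs:

```python
# A single perceptron: weighted sum of inputs plus a bias, passed
# through a step activation function.
def perceptron(inputs, weights, bias):
    weighted_sum = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if weighted_sum > 0 else 0  # step activation

# Hand-picked weights and bias so the output is 1 only when both inputs are 1
AND_WEIGHTS, AND_BIAS = [1.0, 1.0], -1.5
perceptron([1, 1], AND_WEIGHTS, AND_BIAS)  # → 1
perceptron([1, 0], AND_WEIGHTS, AND_BIAS)  # → 0
```

An MLP stacks layers of such units, with the outputs of one layer feeding the inputs of the next; the hidden layers let the network model patterns (such as XOR) that no single perceptron can.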