# What are Algorithms?

The method of solving a problem is known as an algorithm. More precisely, an algorithm is a sequence of instructions that acts on some input data to produce the desired output in a finite number of steps.

In computer science, algorithm analysis is the process of finding the computational complexity of algorithms—the amount of time, storage, or other resources needed to execute them.

# DSA: Data Structure and Algorithms

Data structures and algorithms (DSA) walks through solutions to standard problems and gives you insight into how efficient each one is. It also introduces you to the science of evaluating the efficiency of an algorithm, which helps you choose the best among many alternatives.


## Advantages of Algorithms

An algorithm is easy to understand: it is a step-wise representation of the solution to a given problem. Because the problem is broken down into smaller pieces or steps, it is easier for the programmer to convert the algorithm into an actual program.

## Where are algorithms used in real life?

Algorithms lie at the heart of computing. If we observe our surroundings, we can find several algorithms working to solve our daily-life problems: social media networks, GPS applications, Google search, e-commerce platforms, Netflix recommendation systems, and many other applications are powered by algorithms.

## What are the properties of algorithms?

- Input – the data the algorithm works on
- Output – the desired result
- Finiteness – it must terminate after a finite number of steps
- Definiteness – every step is clear and unambiguous
- Effectiveness – every step is basic enough to carry out without applying any intelligence
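
As a hypothetical illustration (the function below is our own, not from the article), a simple linear search exhibits all five properties:

```python
def linear_search(values, key):
    """Return the index of key in values, or -1 if absent."""
    # Input: a list and a key to look for.
    for i in range(len(values)):      # finiteness: at most len(values) iterations
        if values[i] == key:          # definiteness/effectiveness: a plain comparison
            return i                  # output: the index where the key was found
    return -1                         # output: -1 signals "not found"

print(linear_search([4, 7, 1, 9], 1))  # → 2
print(linear_search([4, 7, 1, 9], 5))  # → -1
```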

# Algorithms Analysis

Analysis of algorithms is the determination of the amount of time and space resources needed to execute them. Usually, the efficiency or running time of an algorithm is stated as a function relating the input length to the number of steps, known as time complexity, or to the volume of memory used, known as space complexity.

## Why do we require algorithm analysis in data structures?

Algorithm analysis is a necessary part of computational complexity theory, which provides a theoretical estimate of the resources an algorithm requires to solve a given computational problem.

## What are the factors of algorithm analysis?

**Speed is one of the key parameters in determining the potential of an algorithm. There are other factors, like user-friendliness, security, maintainability, and memory usage, that determine the quality of an algorithm.** Space and time complexity are the metrics used to measure these parameters.

*“Just because we got the right answer (end) does not mean that the method (means) we employed to obtain it was correct. The efficiency of obtaining the correct answer depends on the method employed to obtain it. Hence, scientific analysis of the performance of the method is very important. ‘The ends justify the means’ does not hold good in computer science.” – Yashavant Kanetker*

## Why Analyze Algorithms?

Multiple algorithms may exist for solving a given problem. To determine which algorithm is more efficient than others, we need to analyze the algorithms.

The analysis is done by comparing the time or space required for executing the algorithms.

This analysis of algorithms gives us a scientific reason to determine which algorithm should be chosen to solve the problem.

## What to consider, What to Ignore?

For this, we must first identify the significant, time-consuming operation(s) in the algorithm.

There are two classes of operations that are typically chosen as the significant operation: comparison and arithmetic.
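
For searching, the significant operation is the comparison. A minimal sketch (the helper names are our own) counts key comparisons for a linear search versus a binary search on the same sorted list:

```python
def linear_search_count(values, key):
    """Linear search; returns (index, number of key comparisons)."""
    comparisons = 0
    for i, v in enumerate(values):
        comparisons += 1
        if v == key:
            return i, comparisons
    return -1, comparisons

def binary_search_count(sorted_values, key):
    """Binary search on a sorted list; counts one comparison per halving step."""
    lo, hi = 0, len(sorted_values) - 1
    comparisons = 0
    while lo <= hi:
        mid = (lo + hi) // 2
        comparisons += 1
        if sorted_values[mid] == key:
            return mid, comparisons
        if sorted_values[mid] < key:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, comparisons

data = list(range(1000))
print(linear_search_count(data, 999))   # → (999, 1000): N comparisons in the worst case
print(binary_search_count(data, 999))   # far fewer: roughly log2(N) halving steps
```

Counting the dominant operation this way is exactly the comparison the analysis above asks for, without the noise of wall-clock timing.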


## Cases to consider during analysis

- Best case input
- Worst case input – this could involve comparing the key to each list value for a total of N comparisons.
- Average case input – to deliver average performance.

- Determining – the number of different groups into which all possible input sets can be divided.
- Probability – the probability that the input will come from each of these groups.
- Determining – how long the algorithm runs for input from each of these groups; all of the inputs in a group should take the same amount of time.
- If they do not, the group must be split in two.
- Average case time – the formula below gives the average-case time:

A(n) = Σ (i = 1 to m) [Pi × Ti]

where:

m = number of groups

Pi = probability that the input comes from group i

Ti = time the algorithm takes for input from group i
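
The formula can be evaluated directly. A minimal Python sketch (the group values below are hypothetical: searching a 4-element list where each position is equally likely to hold the key):

```python
def average_case_time(groups):
    """A(n) = sum of p_i * t_i over the given (probability, time) groups."""
    total_p = sum(p for p, _ in groups)
    assert abs(total_p - 1.0) < 1e-9, "group probabilities must sum to 1"
    return sum(p * t for p, t in groups)

# Group i = "key found at position i", taking i comparisons, each with probability 1/4.
groups = [(0.25, 1), (0.25, 2), (0.25, 3), (0.25, 4)]
print(average_case_time(groups))  # → 2.5 comparisons on average
```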

# Rates of Growth

It is the rate of increase in operations as the size of the problem increases that matters most – called the rate of growth of the algorithm – because small input sets can hide rather dramatic differences.

## Asymptotic Notations of Algorithms

- Big Omega (Ω) – algorithms that grow at least as fast as some function g: every function in this class grows as fast as g or faster, f(n) ≥ g(n).
- Big Oh (O) – algorithms that grow no faster than g: the class of functions that grow no faster than g, f(n) ≤ g(n).
- Big Theta (Θ) – algorithms that grow at the same rate as g: the class of functions that grow at the same rate as the function g, f(n) = g(n).
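
These definitions can be checked numerically. Below is a minimal sketch (the helper `bounded_above` and the constants `c`, `n0` are illustrative choices, not from the article) that tests whether f(n) ≤ c·g(n) holds from n0 onward, which is what f = O(g) requires for some c and n0:

```python
def bounded_above(f, g, c, n0, n_max=10_000):
    """Check empirically that f(n) <= c * g(n) for all n0 <= n <= n_max."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

f = lambda n: 3 * n + 10   # linear, with constants
g = lambda n: n * n        # quadratic

print(bounded_above(f, g, c=1, n0=5))  # True: 3n + 10 <= n^2 once n >= 5
print(bounded_above(g, f, c=1, n0=5))  # False: n^2 outgrows 3n + 10
```

A finite check like this is only evidence, not a proof, but it makes the role of c and n0 in the definition concrete.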

## Is Asymptotic Algorithm Analysis Perfect?

Consider two functions f(n) = 100·n·log n and g(n) = 2·n·log n. Ignoring the constants, the order of growth of both algorithms is n log n, so both algorithms are asymptotically the same, and we cannot judge which one is better.
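
To see the hidden constants concretely, here is a minimal sketch (the names f and g simply stand for the two running times above):

```python
import math

def f(n):
    return 100 * n * math.log2(n)  # order n log n, constant factor 100

def g(n):
    return 2 * n * math.log2(n)    # order n log n, constant factor 2

# The ratio stays at 50 no matter how large n gets: same growth
# rate asymptotically, but a 50x difference in actual cost.
for n in (2 ** 10, 2 ** 20):
    print(n, f(n) / g(n))  # → prints 50.0 for both
```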

In this analysis, we always consider input sizes n greater than some constant value n0. But in reality, we may never supply an input bigger than n0. In such cases, an asymptotically slower algorithm may perform better than an asymptotically faster one.

From the above examples, we can conclude that asymptotic analysis is not perfect, but it still remains the best method available.

Hence, it is widely used while analyzing algorithms.

# Growth Rate Comparison

To compare the growth rates of two functions:

- If anything is common, cancel it out.
- Take the log of both sides, then compare.
- Replace n with some large power of 2.
- Compare the two functions.

“**If two functions differ by a constant factor, then asymptotically they are the same; they differ only in actual value.**“
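
The steps above can be sketched in code. The helper `compare_growth` is hypothetical; it applies the "substitute a large power of 2" heuristic:

```python
import math

def compare_growth(f, g, n=2 ** 20):
    """Compare two growth functions at one large power-of-2 input."""
    fv, gv = f(n), g(n)
    if fv < gv:
        return "f grows slower"
    if fv > gv:
        return "f grows faster"
    return "same"

# n log n vs n^2: taking logs gives (log n + log log n) vs 2 log n,
# so n^2 grows faster; substituting n = 2^20 confirms it.
print(compare_growth(lambda n: n * math.log2(n), lambda n: n ** 2))  # → f grows slower
```

A single sample point is only a heuristic, as the steps above suggest; crossovers at small n can mislead, which is why the large power of 2 is used.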


# Types of Algorithms

Though the problems might be very different, it is possible that the algorithms used to solve them are similar. For example, counting the elements in a list and checking whether a value exists in a list are two different problems.

Still, the algorithms for both are very similar. Hence, algorithms are often classified by their characteristics rather than by the problem they attempt to solve. Given below is a list of some common types of algorithms –

- Iterative algorithms
- Recursive algorithms
- Backtracking algorithms
- Divide and Conquer algorithms
- Dynamic programming algorithms
- Greedy algorithms
- Branch and bound algorithms
- Brute force algorithms
- Randomized algorithms
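
For instance, the same problem – summing a list – can be solved with either an iterative or a recursive algorithm. A minimal Python sketch of the first two types above:

```python
def sum_iterative(values):
    """Iterative: accumulate the total in a loop."""
    total = 0
    for v in values:
        total += v
    return total

def sum_recursive(values):
    """Recursive: first element plus the sum of the rest."""
    if not values:          # base case: empty list sums to 0
        return 0
    return values[0] + sum_recursive(values[1:])

print(sum_iterative([1, 2, 3, 4]))  # → 10
print(sum_recursive([1, 2, 3, 4]))  # → 10
```

Both give the same output; they differ in characteristics (loop versus self-call), which is exactly how the classification above groups them.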

# Conclusion of Algorithm Analysis

- An algorithm is a method of accomplishing a task in a finite number of steps.
- An algorithm must have input, output, finiteness, definiteness, and effectiveness.
- Analysis of an algorithm involves determining its time requirements or memory space requirements.
- Asymptotic analysis evaluates an algorithm’s performance in terms of input size: it calculates how time/space requirements grow with input size.
- Asymptotic notation describes three rates of growth: big omega, big oh, and big theta.
- Usually, a Big-O analysis of an algorithm is done, as it bounds the worst-case time complexity.
- The time complexity of a function can be found by determining the number of times the dominant operation is performed in the function.
- The order of growth of two functions can be compared by taking the log of both functions and substituting a large value in place of n.