Look deep into nature, and then you will understand everything better. - Albert Einstein

**M**odern computing has already seen many advances in computation and performance efficiency, and will certainly see many more in the future. From the ancient abacus to contemporary supercomputers, much has changed and has contributed to some of the greatest pioneering works and achievements. But can these systems self-repair or reproduce? Can they work with noisy, inconsistent data of huge dimension and size, or obtain the best output for a problem whose search domain is vast? The answer is 'NO'. Even with the best technology and the best resources, no such system can keep pace with the explosive growth of the search space as inputs grow large. Mother Nature, on the contrary, brings all of its constituents together, from seemingly insignificant micro-organisms to the larger beings of flora and fauna, and endows them with the capabilities to self-repair, to identify and process noisy or inconsistent data, to reproduce their own kind, and even to find the most suitable habitats and carry out similar day-to-day activities in which the search space is as large as the entire Earth. Human beings, as specimens of nature's masterpiece, have many capabilities that even the most modern systems can only dream of. These hindrances in the path of technology have motivated the search for new algorithms that imitate nature and for systems that try to fill the above lacunae. Such systems are fruitful in all those scenarios where conventional systems fail to produce acceptable solutions due to a gigantic search space and time constraints.

Tracing back through history, the first formal development of such an algorithm can be found in documents written by the great mathematician and cryptanalyst Alan Turing in 1945 and 1948, at Bletchley Park and the National Physical Laboratory, UK, respectively. He called his search method heuristic, i.e., derived from trial and error: it can generate a correct result, but correctness is not guaranteed. Later, in the year 1962, the pioneering work that paved the way for further developments was the invention of the *'Genetic Algorithm'* by John Holland and his collaborators.

This algorithm imitated natural selection based on Charles Darwin's *"survival of the fittest"* strategy, and proved to be an abstract model for solving many optimization problems. It used novel operators such as crossover and mutation, along with a proper selection mechanism. The algorithm turned out to be a boon: it solved many optimization problems (including multi-modal and constrained optimization problems) and could even deal with noisy data and discontinuous functions. Although it does not always generate the exact solution, it has been observed to provide near-optimal solutions even to problems that cannot be solved optimally in polynomial time (such as NP-hard problems).
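As an illustration, the interplay of selection, crossover, and mutation described above can be sketched in a few lines of Python. This is a toy example maximizing f(x) = -(x - 3)^2; the real-valued encoding, tournament selection, blend crossover, and parameter values are illustrative choices for the sketch, not Holland's original formulation:

```python
import random

random.seed(42)  # fixed seed so the toy run is repeatable

def fitness(x):
    # Toy objective with a single maximum at x = 3.
    return -(x - 3.0) ** 2

def evolve(pop_size=30, generations=100, mutation_rate=0.2):
    # Random initial population drawn from [-10, 10].
    population = [random.uniform(-10, 10) for _ in range(pop_size)]
    for _ in range(generations):
        def select():
            # Selection: a tournament of size 2 keeps the fitter individual.
            a, b = random.sample(population, 2)
            return a if fitness(a) > fitness(b) else b
        next_gen = []
        while len(next_gen) < pop_size:
            p1, p2 = select(), select()
            # Crossover: arithmetic blend of the two parents.
            child = 0.5 * (p1 + p2)
            # Mutation: occasional small Gaussian perturbation.
            if random.random() < mutation_rate:
                child += random.gauss(0, 1)
            next_gen.append(child)
        population = next_gen
    return max(population, key=fitness)

best = evolve()  # ends up close to the optimum x = 3
```

Even this stripped-down version shows the behaviour described in the text: the population as a whole drifts toward fitter regions, while mutation preserves enough diversity to keep the search from stalling.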

Further work based on emulating nature includes the Evolution Strategy by Ingo Rechenberg and Hans-Paul Schwefel in the year 1963, and the development of Ant Colony Optimization by Marco Dorigo in the year 1992. Since then, many new nature-inspired algorithms have been derived, such as Differential Evolution, Particle Swarm Optimization, the Firefly Algorithm, Cuckoo Search, the Bat Algorithm, the Flower Pollination Algorithm, and Chemical Reaction Based Optimization. On being thoroughly tested, these algorithms have proven their immense capability to solve many single- and multi-objective optimization benchmark functions.

All of these algorithms share the ability to hunt the search space in parallel and to choose subsequent candidate solutions based on the best results obtained so far, which lets them quickly provide acceptable results in polynomial time. Since their correctness is not guaranteed, their usage is limited to conditions where a certain amount of error and uncertainty can be tolerated. The question that arises from these algorithms sharing a common root is: which one is to be preferred, and when? This can be answered with the help of a well-established theorem, the 'No Free Lunch Theorem', proposed and proved by Wolpert and Macready. Simply put, the theorem states that, averaged over all possible problems, all these algorithms perform equally well; the difference lies in their approach to a particular problem. As different types of problems bear different levels of diversity and complexity, no single algorithm can provide solutions to all of them. Clearly, all types of problems cannot be solved by any single nature-inspired algorithm, and thus there is a need for many algorithms.

But how are the capabilities of such an algorithm measured? Two typical concepts gauge how well an algorithm is going to perform: diversification and intensification. Diversification is the capability of the algorithm to explore the search space, whereas intensification refers to searching the local region by exploiting the good solutions already obtained. Many changes have also been made to the existing algorithms to solve different kinds of problems, leading to various encoding schemes, various operators, and ultimately numerous advanced versions of the same algorithm.

From the very advent of nature-inspired algorithms, researchers have been keenly exploring their different applications. Although most were at first used in research-oriented fields only, they have now spread their wings across all domains, from computer science and mechanical and structural engineering to business management and administration. Many well-established companies run the Genetic Algorithm as a daily routine to solve tough combinatorial problems in data mining, planning, scheduling, image processing, and gaming. Nature-inspired algorithms also have huge applicability in many other interdisciplinary fields. Algorithms such as the Genetic Algorithm, Particle Swarm Optimization, and Cuckoo Search have already been used to optimize many real-life problems, such as the design of welded beams and springs, and in the fields of signal processing and network communications, where they have been observed to outperform many traditional approaches in terms of time and resource consumption. Still, a lot is left to do. To date, most of the algorithms are very problem-specific, and there are a huge number of problems that cannot be solved by a single algorithm. Rigorous testing has shown that at times these algorithms can suffer from premature convergence and get stuck in local optima. It has also been observed that most of them tend to evaluate the same solutions repeatedly while leaving many untouched. Thus, there is huge scope to improve these algorithms, or to devise new ones by observing the virtues of nature, so as to mitigate these gaps.

In conclusion, it can be said that Mother Nature has strange ways of optimizing its processes, and imitating them has bestowed on us wonderful techniques for solving complex problems that would otherwise have been very difficult to solve. Since no one algorithm has answers to all problems, one should keep looking for more techniques by pondering nature-inspired phenomena. Perhaps the best of these techniques is yet to be discovered; one should therefore expect such newly discovered algorithms in the near future.