This article aims to answer that question so that we walk away understanding the benefits of genetic optimization. We'll define some terminology, explain why we should care about GAs, and describe some benefits and drawbacks of these biologically-inspired optimization algorithms.
|
If you're someone who values efficiency and is always seeking ways to improve—be it maximizing your investment returns, cutting down on wasted time, or boosting your productivity—then you should definitely care about genetic algorithms. These algorithms are game-changers in a variety of fields, from helping doctors make more accurate diagnoses to optimizing financial portfolios for better returns. They come in especially handy for tackling 'real-world' problems that are messy and complicated, where traditional methods might struggle. In the following sections, we'll dive deeper into why Genetic Algorithms excel where others fall short.
|
Before we get too deep in the weeds, I wanted to create a glossary of common terms that come up when talking about genetic optimization. A lot of the time, these terms look scarier than they actually are, and can intimidate people out of learning an interesting, important, and fun subject. Some of these terms are:
- Optimization: The act of finding the best way to do something. For example, if you're trying to save money, optimizing might mean finding the best deals and discounts so you can keep more cash in your pocket.
- Algorithm: A recipe or a set of steps for achieving a particular goal. Think of it like following a recipe to bake a cake: you have a list of steps to follow in a specific order.
- Neural Networks: These are a type of artificial intelligence that learns by adjusting internal settings, kind of like tuning a musical instrument to get the right sound. They're often used to recognize patterns in data and are guided by a process called "gradient descent," which is a fancy way of saying they keep adjusting until they make the fewest mistakes.
- Reinforcement Learning: This is a type of machine learning where a computer program learns to make decisions by performing actions and receiving rewards or penalties based on the outcomes.
- Objective (Fitness) Function: This is what you're trying to improve when using an algorithm. In the context of investment, for example, you might be trying to get the highest return for the least risk, which can be measured by something called the "Sharpe ratio."
- Individual and Population: When generating solutions with a genetic algorithm, you determine the number of solutions that you want to generate. Each solution is an individual, and the collection of all solutions is a population.
- Mutation: A small, random change made to a solution to keep the algorithm from getting stuck in a rut. It's like rolling the dice to add a little unpredictability.
- Global Optima: The absolute best solution among all possible solutions. It's like winning a gold medal at the Olympics—not just being the best locally, but the best in the world.
- Local Optima: A solution that's better than all its neighbors but not necessarily the best overall. It's like being the best player in a local sports league but not necessarily in the entire country.
- Stochastic: A process involving some level of randomness or chance. Think of it like the shuffle feature on a music playlist.
- Computation: This is the process of your computer running mathematical calculations to solve some problem. If an algorithm is computationally expensive, that means it takes a lot of power to run.
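To see how these terms fit together, here is a minimal, toy sketch of a genetic algorithm in Python. Every function name and parameter here is illustrative, not taken from any particular library:

```python
import random

def fitness(individual):
    # Toy objective (fitness) function: maximize the sum of the genes.
    return sum(individual)

def mutate(individual, rate=0.1):
    # Mutation: randomly flip each bit with a small probability.
    return [1 - g if random.random() < rate else g for g in individual]

random.seed(0)
# Population: a collection of candidate solutions (individuals).
population = [[random.randint(0, 1) for _ in range(10)] for _ in range(20)]

for generation in range(50):
    # Selection: keep the fitter half of the population.
    population.sort(key=fitness, reverse=True)
    survivors = population[: len(population) // 2]
    # Reproduce with mutation to refill the population.
    population = survivors + [mutate(random.choice(survivors)) for _ in survivors]

best = max(population, key=fitness)
print(fitness(best))  # best score found; the global optimum here is 10
```

Real GA libraries add crossover and fancier selection schemes, but the loop above (evaluate, select, mutate, repeat) is the core idea behind every term in the glossary.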
|
As I mentioned in the introduction, the purpose of this article isn't to discuss the technical implementation of genetic algorithms. If you are looking for that, check out this very detailed technical guide that I wrote on the matter. In that article, I glossed over one important question: why use genetic algorithms, especially over other optimization methods? What are their concrete benefits, and what do I hope to accomplish when optimizing a portfolio?
|
Genetic algorithms are biologically-inspired optimization algorithms that have several benefits compared to other approaches, including neural networks. I will discuss some of these benefits below.
|
Capable of optimizing multiple objectives simultaneously
|
Genetic algorithms have the remarkable ability to optimize any arbitrary objective function. They do this by not relying on gradients, something we'll discuss in greater detail in the next section. This means we're able to optimize for any and all objectives, often at the same time. Gradient-trained neural networks, by contrast, need their objectives collapsed into a single differentiable loss function.
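As a sketch of what this looks like in practice, here is a toy fitness function that scores a portfolio on two objectives at once. The return series, the penalty weight, and the helper functions are all illustrative assumptions, not any standard implementation:

```python
def sharpe_ratio(returns):
    # Mean return divided by volatility (risk-free rate assumed zero here).
    mean = sum(returns) / len(returns)
    var = sum((r - mean) ** 2 for r in returns) / len(returns)
    return mean / var ** 0.5 if var > 0 else 0.0

def max_drawdown(returns):
    # Largest peak-to-trough decline of the cumulative equity curve.
    equity, peak, worst = 1.0, 1.0, 0.0
    for r in returns:
        equity *= 1 + r
        peak = max(peak, equity)
        worst = max(worst, (peak - equity) / peak)
    return worst

def fitness(returns, drawdown_penalty=2.0):
    # One scalar score a GA can rank individuals by; the weighting is arbitrary.
    return sharpe_ratio(returns) - drawdown_penalty * max_drawdown(returns)

steady = [0.01, 0.02, 0.005, 0.015]       # hypothetical daily returns
volatile = [0.30, -0.25, 0.30, -0.25]
print(fitness(steady) > fitness(volatile))  # prints True
```

Because the GA only needs a number to rank candidates by, you can bolt together objectives like these freely; no gradient of the combined score ever has to exist.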
|
Works well on non-differentiable functions
|
Picture a skateboard ramp that's really smooth. You can skate up and down without encountering any bumps, holes, or sharp turns. On this kind of ramp, you could figure out the 'slope' or steepness at any spot, meaning you can predict exactly how your skateboard will tilt at each point as you skate along. This type of ramp represents what mathematicians call a 'differentiable' function, where you can smoothly trace the curve without any sudden changes.
|
Most importantly, on this ramp, it is possible to find the minimum point by using an optimization algorithm called gradient descent. Finding the minimum point, or the global minimum, is usually the goal of these optimization algorithms.
|
Now, imagine a skateboard ramp that has a giant hole right in the middle. Because of this hole, you can't define the 'slope' or steepness at every spot on the ramp; when you reach the hole, there's no surface to measure. This makes it difficult, if not impossible, to find the lowest point on the ramp using methods like gradient descent, which rely on having a smooth, predictable surface to work with. This kind of ramp is what mathematicians call a 'non-differentiable' function, where you encounter abrupt changes or gaps that prevent smooth tracing of the curve.
|
Genetic algorithms excel at optimizing both differentiable and non-differentiable functions. If the objective we're trying to optimize is non-differentiable (for example, maximizing the Sharpe ratio while minimizing our drawdown), then genetic algorithms do a far better job than standard neural networks. They also don't require extensive labeling like classification algorithms, and they converge more rapidly and consistently than reinforcement learning algorithms.
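Here is a toy illustration of the skateboard-ramp idea: an objective with a discontinuous "gap" that defeats gradient information, which a bare-bones evolutionary search can still cross. The function, seed, and step size are all made up for illustration:

```python
import random

def bumpy(x):
    # Non-differentiable objective: a V-shape with a discontinuous 'hole'
    # (the skateboard ramp with a gap from the analogy above).
    if 1.0 < x < 2.0:
        return 10.0        # the gap: slope information is useless here
    return abs(x - 3.0)    # true minimum at x = 3

random.seed(42)
# A bare-bones evolutionary search: mutate the best point found so far.
best = -5.0
for _ in range(2000):
    candidate = best + random.gauss(0, 0.5)  # mutation step
    if bumpy(candidate) < bumpy(best):
        best = candidate

print(best)  # ends up near x = 3 despite the discontinuity
```

A gradient follower starting at x = -5 would slide right until it hit the gap and stall; the random mutation steps occasionally jump clean over it.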
|
Thorough search of the solution space
|
Genetic algorithms may not guarantee the absolute best ('optimal') solution, but they are excellent at finding solutions that are 'good enough' for most practical purposes. One of the ways they do this is through a feature called the 'mutation operator,' which introduces random changes in the set of possible solutions, much like genetic mutations introduce variations in real-world populations. This element of randomness, known as 'stochasticity,' ensures that the algorithm doesn't get stuck and keeps exploring a wide range of potential solutions.
|
The benefit of this approach is that it provides a broad sampling of the 'solution space,' offering multiple viable options. This is particularly useful in real-world scenarios where finding a 'good enough' solution quickly is often more important than spending a lot of time and resources to find the 'perfect' answer.
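As one concrete sketch, a mutation operator for portfolio weights might look like the following. The function name, mutation rate, and noise scale are all illustrative assumptions, not from any specific library:

```python
import random

def mutate_weights(weights, rate=0.2, scale=0.05):
    # Perturb each weight with small Gaussian noise, with probability `rate`,
    # clamping at zero so we never short an asset in this toy setup.
    mutated = [max(0.0, w + random.gauss(0, scale)) if random.random() < rate else w
               for w in weights]
    total = sum(mutated)
    # Renormalize so the weights still sum to 1 (a fully-invested portfolio).
    return [w / total for w in mutated]

random.seed(1)
parent = [0.25, 0.25, 0.25, 0.25]
child = mutate_weights(parent)
print(abs(sum(child) - 1.0) < 1e-9)  # prints True: still a valid allocation
```

The randomness is the point: each call produces a slightly different, still-valid portfolio, which is exactly the stochastic nudge that keeps the search from stalling.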
|
Doesn't just generate one solution; generates multiple good solutions
|
One of the most interesting aspects of genetic algorithms is their ability to generate a variety of 'decent' solutions, especially when you're trying to balance more than one goal. Let's say you're optimizing an investment portfolio based on two criteria: the Sharpe ratio, which measures your return on risk, and the maximum drawdown, which tells you the biggest loss you could face. Genetic algorithms will churn out a diverse set of portfolios, each with its own set of trade-offs.
|
Some portfolios might have a high Sharpe ratio but also come with a higher potential for loss (max drawdown). Others might offer lower risk (lower max drawdown) but also give you less bang for your buck (lower Sharpe ratio). This range of solutions lets you pick a portfolio that best fits your personal comfort level with risk and return. It's like shopping for a car where you can decide the balance between fuel efficiency and horsepower. Most solutions will fall somewhere in between the extremes, giving you the flexibility to choose based on what you value most.
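One way to reason about these trade-offs is a Pareto filter, which keeps only the portfolios that aren't beaten on both objectives at once. A sketch, with made-up numbers:

```python
def pareto_front(portfolios):
    # Keep portfolios not dominated by any other: here, a portfolio dominates
    # another if it has at least as high a Sharpe ratio AND at most as small
    # a max drawdown (and isn't the same portfolio).
    front = []
    for p in portfolios:
        dominated = any(
            q["sharpe"] >= p["sharpe"] and q["drawdown"] <= p["drawdown"] and q != p
            for q in portfolios
        )
        if not dominated:
            front.append(p)
    return front

# Hypothetical candidates a GA might produce (all numbers are invented).
candidates = [
    {"name": "A", "sharpe": 1.8, "drawdown": 0.30},
    {"name": "B", "sharpe": 1.2, "drawdown": 0.12},
    {"name": "C", "sharpe": 0.9, "drawdown": 0.25},  # dominated by B
    {"name": "D", "sharpe": 1.5, "drawdown": 0.18},
]

print([p["name"] for p in pareto_front(candidates)])  # prints ['A', 'B', 'D']
```

Everything left on the front is a defensible choice; picking between A, B, and D is purely a question of your own risk tolerance, which is exactly the car-shopping trade-off described above.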
|
Drawbacks of Genetic Algorithms
|
Despite all of their advantages, GAs are far from the "perfect" algorithm. There are several drawbacks to genetic optimization that may make you reconsider using them for your use case.
|
Computationally Expensive
|
Running a genetic algorithm involves generating tens to hundreds of individuals, combining individuals with each other (crossover), randomly altering these individuals (mutation), and then evaluating each individual's performance in a backtest (evaluation). All of this is to say: that's a lot of steps for the computer to perform. It's very computationally expensive to run a genetic algorithm, particularly with the high population sizes that are typically preferred.
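To see why, a back-of-envelope estimate helps. All of the numbers here are hypothetical:

```python
population_size = 200    # large populations are typically preferred
generations = 100
backtest_seconds = 2.0   # hypothetical cost of evaluating one individual

# Every generation evaluates every individual with a full backtest.
evaluations = population_size * generations
print(evaluations)                             # 20000 backtests
print(evaluations * backtest_seconds / 3600)   # roughly 11 hours of compute
```

Even a modest two-second backtest turns into hours of wall-clock time, and real backtests over years of market data are often far slower than that.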
|
Prone to Local Minima
|
Depending on the exact parameters you use to run a genetic algorithm, it may be prone to getting stuck in local minima. Mutation operations can help alleviate this, but it is still a well-known drawback of genetic optimization.
|
Non-Deterministic Results
|
GAs are stochastic, which means you'll get different results for each run, even if you start with the same initialization settings. Because of this non-determinism, it's possible that each time you run a genetic algorithm, you get a different result that you aren't able to reproduce. This stochasticity is beneficial for avoiding local minima, but it can be an issue when running rigorous, robust experiments where reproducibility is a concern.
|
Risk of Overfitting
|
Arguably the biggest drawback of genetic algorithms is their risk of overfitting. By design, GAs alter their parameters to maximize performance in a backtest. This is inherently prone to overfitting; we may end up matching patterns in a backtest that don't exist in real-time trading. Markets are constantly changing, and past behavior doesn't necessarily predict future results.
|
While not minimizing the real risks of overfitting, there are several methods we can use to mitigate it. The classic way is splitting the backtest data into two sets. One set is the training set; the portfolio is optimized using this subset of data. Then, we test our portfolio's performance on the second set, the validation set. While this method doesn't inherently prevent overfitting, it does help us understand how likely our algorithm is to be overfit. A strategy that performs exceptionally well on the training set and poorly on the validation set is almost certainly overfit. However, if a strategy has similar performance on the training set and the validation set, we can have some confidence that the portfolio will perform similarly in real-time trading.
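A minimal sketch of this train/validation idea, with a hypothetical scoring function, threshold, and made-up return series:

```python
def sharpe(returns):
    # Toy Sharpe-style score: mean return over volatility (risk-free rate = 0).
    mean = sum(returns) / len(returns)
    var = sum((r - mean) ** 2 for r in returns) / len(returns)
    return mean / var ** 0.5 if var > 0 else 0.0

def train_validation_split(returns, train_fraction=0.7):
    # Optimize on the earlier portion; hold out the rest for validation.
    cut = int(len(returns) * train_fraction)
    return returns[:cut], returns[cut:]

def looks_overfit(train_score, validation_score, tolerance=0.5):
    # Heuristic red flag: great in training, poor out of sample.
    # The tolerance is an arbitrary choice for illustration.
    return train_score - validation_score > tolerance

daily_returns = [0.01, -0.02, 0.015, 0.005, -0.01, 0.02, 0.0, 0.01, -0.005, 0.012]
train, validation = train_validation_split(daily_returns)
print(len(train), len(validation))  # prints 7 3
print(looks_overfit(sharpe(train), sharpe(validation)))
```

The important detail is that the split is chronological, not random: the validation set must be data the optimizer never saw, just as live trading will be.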
|
Another way of preventing overfitting is by running backtests on certain windows of the training set, and then using the average performance to optimize our objective functions. By making this change, we're no longer fitting our data onto the entire training set, but rather onto an average of how the market performed in the past. This seems to reduce the risk of overfitting, but more experiments, such as this one, are needed to make a definitive conclusion.
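A sketch of this windowed-averaging idea, using a made-up return series and a simple total-return score. The window and step sizes are arbitrary illustrative choices:

```python
def total_return(returns):
    # Compound the returns into a single growth figure.
    equity = 1.0
    for r in returns:
        equity *= 1 + r
    return equity - 1.0

def windowed_score(returns, window, step, score):
    # Score the strategy on overlapping windows and average the results,
    # so the GA optimizes typical behavior rather than one historical path.
    scores = [score(returns[i:i + window])
              for i in range(0, len(returns) - window + 1, step)]
    return sum(scores) / len(scores)

history = [0.01, 0.02, -0.01, 0.015, -0.005, 0.02, 0.01, -0.02]
print(windowed_score(history, window=4, step=2, score=total_return))
```

Because each window is scored independently, a strategy that only works in one unusual stretch of history gets dragged down by its performance everywhere else.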
|
Aside: NexusTrade Referral Program
|
The optimization engine is one of the most powerful and unique features that NexusTrade has to offer. However, it's very computationally expensive. Thus, we're doing a slow launch for a small subset of power-users on the platform.
Do you want early access? Here's what you have to do!
- Confirm your email address by going to the profile page.
- Invite your friends with your unique referral code.
- Make sure your friends confirm their email addresses.
- Email us at nexustrade@starks-technology.com saying that you want early access to the optimization engine.
|
What are the exact rewards? They are outstanding!
- Invite 2 friends for 1 week of access 😲
- Invite 5 friends for 1 month of access 😱
- Invite 15 friends for 6 months of access 💀
- If you invite more than 15 people, you will receive a special invitation from the creator of NexusTrade, Austin Starks 🥳
Referring your friends is the ONLY way to get access to these premium features for FREE.
|
Genetic algorithms are a robust solution to optimizing arbitrary objective functions. These algorithms are able to optimize non-differentiable functions, unlike other algorithms such as gradient descent. They're also able to search the solution space effectively, and generate an entire population of solutions that are at least decent.
|
However, genetic optimization is not without its flaws. It's very computationally expensive to run a GA. Because GAs are heuristics, there's no guarantee that they will find the best set of solutions, and they may converge prematurely. Lastly and most importantly, GAs are prone to overfitting, so even though we may find good results in a backtest, those results may not translate to real-time trading.
|
|