Preview | 2008.06.11

Naturally I will be indulging my curiosity about how effective a good stock market prediction network can be. It would be a shame not to put my money where my mouth is.


I think the greatest opportunity to improve the algorithm is to provide as many reasonable prediction inputs as possible. The network will be good at analyzing historical data, but it will be great at merging this with informed opinions. After all, share value is as dependent on the present and future as it is on the past.

Finding sources for predictive information is no easy task. Most estimates are quarterly, which is far too coarse.

One concept that has gained widespread public attention is the prediction market. The idea is that informed opinion will prevail, and it's being applied most notably to the upcoming elections.


Prediction markets use currency; conveniently, so does the stock exchange. Prediction markets apply economics and Darwinism to solve complex problems, and neural network weight resolution is not much different.

So it follows that I could enlist a number of investors (using simulated money, at first) and query each for a market prediction. These values would be used as inputs to the neural network and, with many other factors, determine the predicted share values. Each day the network would retrain itself and add weight to the more accurate voices.
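
Here's a toy sketch of that loop (numpy, simulated investors, and ordinary least squares standing in for the network's training step; every name and number is made up for illustration): each investor's daily estimate is an input, the model refits on all the history it has each day, and the weights drift toward the accurate voices.

```python
import numpy as np

rng = np.random.default_rng(0)

n_investors = 5
n_days = 250  # roughly a trading year of simulated history

# Simulated "true" share price plus each investor's daily estimate of it;
# skill is the noise level of each voice (lower = more accurate).
true_price = 100 + np.cumsum(rng.normal(0, 1, n_days))
skill = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
estimates = true_price[:, None] + rng.normal(0, 1, (n_days, n_investors)) * skill

for day in range(20, n_days):
    # Retrain on everything seen so far: plain least squares stands in for
    # the network's weight resolution over the investor inputs.
    X = np.c_[estimates[:day], np.ones(day)]
    weights, *_ = np.linalg.lstsq(X, true_price[:day], rcond=None)

    # Blend today's voices with the freshly retrained weights.
    blended = np.append(estimates[day], 1.0) @ weights

print("weight per investor:", np.round(weights[:-1], 3))
print(f"final day: blended={blended:.2f}  actual={true_price[-1]:.2f}")
```

The low-noise voices end up carrying most of the weight, which is the whole point: accuracy earns influence without anyone having to decide who the experts are.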


The three components of the algorithm: the network with investors as inputs, the short- and long-term stock values mapped to buy/hold/sell transactions, and the investor attributes.
Each investor would be driven to produce accurate estimates, since the collective account balance would hinge on their predictions. Moreover, if the quarterly gains were distributed based on each investor's prediction performance, the highly dedicated would be rewarded.
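
A rough sketch of those other two pieces, the buy/hold/sell mapping and the payout split (the 2% dead band, the inverse-error weighting, and all of the numbers here are assumptions for illustration):

```python
import numpy as np

def decide(current, short_term, long_term, band=0.02):
    """Map the network's short- and long-term price estimates to a
    buy/hold/sell call. The 2% dead band is arbitrary, not a tuned value."""
    expected = (short_term + long_term) / 2
    if expected > current * (1 + band):
        return "buy"
    if expected < current * (1 - band):
        return "sell"
    return "hold"

def distribute_gains(gains, mean_abs_errors):
    """Split the quarter's gains so that a smaller average prediction error
    earns a larger share (inverse-error weighting, an assumed rule)."""
    accuracy = 1.0 / (np.asarray(mean_abs_errors, dtype=float) + 1e-9)
    return gains * accuracy / accuracy.sum()

print(decide(current=100.0, short_term=104.0, long_term=110.0))     # buy
print(np.round(distribute_gains(10_000, [0.8, 1.5, 3.2, 6.0]), 2))  # biggest cut to the most accurate
```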

This model should easily beat the best-case scenario of a single informed investor. Not only does it weigh and balance more factors than a human can individually process, it also averages out bias toward or against particular stocks.






Related / internal

Some posts from this site with similar content.

Post | 2008.06.11 | Neural networks
I was chatting with Jon about the application of neural networks to stock trading, which is basically a perfect example for explaining the science. It went something like a'this:

Post | 2018.05.13 | Deep dive
Neural style transfer using DL4J and other starter projects. An ATL trip and two hours before the mast.

Post | 2019.05.23 | Thumbnailing
Naive thumbnail generation in Java.

Related / external

Risky click advisory: these links are produced algorithmically from a crawl of the subsurface web (and some select mainstream web). I haven't personally looked at them or checked them for quality, decency, or sanity. None of these links are promoted, sponsored, or affiliated with this site. For more information, see this post.

coornail.net | Teaching an AI to count
Neural networks are a powerful tool in machine learning that can be trained to perform a wide range of tasks, from image classification to natural language processing. In this blog post, we'll explore how to teach a neural network to add together two numbers. You can also think about this article as a tutorial for tensorflow.
Has a preview image link and yet 404 :/

www.paepper.com | Do you know which inputs your neural network likes most? :: Päpper's Machine Learning Blog - This blog features state of the ...
Recent advances in training deep neural networks have led to a whole bunch of impressive machine learning models which are able to tackle a very diverse range of tasks. When you are developing such a model, one of the notable downsides is that it is considered a "black-box" approach in the sense that your model learns from data you feed it, but you don't really know what is going on inside the model.

marcospereira.me | Backpropagation From Scratch - Marcos Pereira
In this post we summarize the math behind deep learning and implement a simple network that achieves 85% accuracy classifying digits from the MNIST dataset.

Created 2025.06 from an index of 775,016 pages.