Category Archives: Economy

Identified lagged stock variables including SP500

It looks like I [marginally] beat the SP500

[Figure: LQD]

I found the optimal significant lag for each of several variables and plotted an upper/lower strategy for each (i.e. hold the SP500 when the variable's daily delta is >0, exit when it's <0), then compared each strategy to the SP500 itself.

LQD was the one that showed promise, because it's a daily series and it gave me a good result.

I actually did my homework: I used a training partition to find the significant correlations, then carried those terms to the test partition and derived their total cumulative returns, and this one made the most sense.
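In rough Python, the idea looks something like this (an illustrative sketch, not the notebook's exact code; `predictor` and `sp500` are assumed to be aligned daily price series):

```python
import pandas as pd

def lagged_signal_backtest(predictor: pd.Series, sp500: pd.Series,
                           max_lag: int = 30, train_frac: float = 0.7):
    """Pick the lag whose shifted daily delta correlates most with SP500
    returns on the training partition, then evaluate the >0-delta hold
    signal on the held-out test partition."""
    rets = sp500.pct_change()
    delta = predictor.diff()
    split = int(len(rets) * train_frac)

    # lag selection uses the training partition only
    best_lag = max(range(1, max_lag + 1),
                   key=lambda k: abs(delta.shift(k).iloc[:split]
                                     .corr(rets.iloc[:split])))

    # hold the SP500 only on days where the lagged delta was positive
    hold = (delta.shift(best_lag) > 0).astype(int)
    test_rets = rets.iloc[split:] * hold.iloc[split:]
    cum_return = (1 + test_rets.fillna(0)).cumprod() - 1
    return best_lag, cum_return
```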

Now I can apply this method to any symbol

=D

The blue line is the SP500.

[Figure: additional variables]

#econometrics

Markowitz Profiles (Modern Portfolio Theory)

I’ve significantly improved on my Markowitz algorithm.

It’s a merge of two tutorials (though I’ve read three). Currently I derive weights based on the optimal Sharpe ratio. At the moment I don’t take any risk-free rate into consideration (i.e. it’s simply return/volatility).

I was trying two different solver methods. The minimize-based solver was taking a really long time, so I tried another tutorial that made use of cvxopt and a quadratic program (which I use to draw the efficient frontier rapidly). I was unable to find the optimal value based on the Sharpe ratio using the cvxopt solver (it would only solve either the bottom or the topmost portion of the line), so I looked at finquant, which uses a Monte Carlo of 5,000 portfolios to derive the optimal points (i.e. max return, min volatility, and max Sharpe ratio).

So I fell back on Monte Carlo to do it for me, i.e. “the path of least resistance,” and the margin of error is acceptable.

The important thing is that it’s very fast and straightforward (KISS), so I will use this to backtest with.
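For illustration, the Monte Carlo fallback looks roughly like this (a minimal sketch, not my exact notebook; assumes a `returns` DataFrame of daily asset returns and, as above, no risk-free rate):

```python
import numpy as np
import pandas as pd

def monte_carlo_max_sharpe(returns: pd.DataFrame,
                           n_portfolios: int = 5000, seed: int = 0):
    """Sample random long-only weight vectors and keep the one with the
    best return/volatility ratio (Sharpe with no risk-free rate)."""
    rng = np.random.default_rng(seed)
    mu = returns.mean() * 252              # annualized mean returns
    cov = returns.cov() * 252              # annualized covariance
    best = (None, -np.inf)
    for _ in range(n_portfolios):
        w = rng.random(len(mu))
        w /= w.sum()                       # weights sum to 1
        ret = w @ mu
        vol = np.sqrt(w @ cov @ w)
        if ret / vol > best[1]:
            best = (w, ret / vol)
    return best                            # (weights, Sharpe proxy)
```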

What I like about Markowitz is that it assumes past behavior is a good model for the future (t-tests on stock returns are used to determine whether two partitions of returns are equal). It’s been said that yesterday’s price is the best predictor of tomorrow’s; the same goes for returns.

Code: https://github.com/thistleknot/Python-Stock/blob/master/Markowitz2.ipynb

PCA Clustergram Violin Plots with Pairplot

Based on the last census data (2010?).

I used Clustergram with PCA scaling to create the cluster labels, and it did a great job of separating the features (i.e. unsupervised). The best-performing k based on fitness was 2 groups (which definitely makes things easier to see).

I’ve arranged the violin plots next to each other for easy comparison.

This would be nice to have in a dashboard, to create custom segments based on various values of k.

I would like to do an ANOVA using TSS, WSS, and BSS (total, within-group, and between-group sums of squares) to confirm the populations are different.
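A minimal sketch of that pipeline, including the BSS/TSS check (assumes a hypothetical numeric census DataFrame `df`, one row per state; scikit-learn's KMeans stands in for Clustergram's fitting step):

```python
import seaborn as sns
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# `df` is a hypothetical numeric census frame, one row per state
X = PCA().fit_transform(StandardScaler().fit_transform(df))
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

# side-by-side violins per feature, split by cluster label
tidy = df.assign(cluster=km.labels_).melt(id_vars="cluster")
sns.violinplot(data=tidy, x="variable", y="value", hue="cluster")

# TSS = WSS + BSS; a large BSS/TSS suggests the groups really differ
wss = km.inertia_
tss = ((X - X.mean(axis=0)) ** 2).sum()
print("BSS/TSS =", (tss - wss) / tss)
```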

Optimal SMA

I’ve perfected my tests

The algorithm uses moving windows and finds the optimal SMA that maximizes return while tracking the volume-weighted price.

Simply using the optimal-SMA strategy is best (as opposed to using mean reversion or MACD), considering I look n days ahead (7), and calculating those other indicators gets tricky (i.e. they are relative to the current and future dates).
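The core search can be sketched like this (a simplification that uses the plain closing price where the notebook tracks volume-weighted price; `horizon=7` matches the look-ahead above):

```python
import pandas as pd

def optimal_sma(price: pd.Series, horizon: int = 7,
                windows=range(5, 200)) -> int:
    """Grid-search SMA lengths; score each by the mean forward
    `horizon`-day return earned on days the price closes above its SMA."""
    fwd = price.shift(-horizon) / price - 1          # n-day-ahead return

    def score(w: int) -> float:
        above = price > price.rolling(w).mean()      # long when above SMA
        return fwd[above].mean()

    return max(windows, key=score)
```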

This shows two different time periods as starting points and the return I would have gotten.

This can be derived by running the following notebook:

https://github.com/thistleknot/Python-Stock/blob/master/single.ipynb

[Figure: backtested returns]

I followed the guide “An Algorithm to find the best moving average for stock trading” and applied it to BTC, finding the optimal SMA for a 1-week return.

The optimal SMA is 87 days, and it’s t-test significant across the training/test partitions (i.e. the returns were statistically the same).
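That check is a one-liner with scipy; here with placeholder data standing in for the two partitions’ 7-day returns:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
train_rets = rng.normal(0.04, 0.05, 200)   # placeholder 7-day returns
test_rets = rng.normal(0.04, 0.05, 200)    # placeholder 7-day returns

# a high p-value fails to reject H0, i.e. the partitions' returns match
t, p = stats.ttest_ind(train_rets, test_rets, equal_var=False)
print(f"t = {t:.2f}, p = {p:.3f}")
```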

The average return for holding for 7 days is ~4%.

It’s currently above the SMA.

I think I can build portfolios

fbProphet, VWP, MACD, RSI, AutoML, FRED data forecasting BTC: 60% return in 1 year

I threw everything I had at it:

* fbProphet predicting volume weighted price
* RSI
* MACD
* Augmented Bollinger Bands (bbands)
* FRED data (110 additional terms)
* autoML using the above (in rolling windows) as predictors

…to determine the next day’s price of BTC.
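For a flavor of the feature engineering (my own sketch, not the linked script), the MACD and RSI columns can be computed with plain pandas, assuming a DataFrame with a 'close' column:

```python
import pandas as pd

def add_indicators(df: pd.DataFrame) -> pd.DataFrame:
    """Attach MACD and RSI columns computed from a 'close' price column."""
    close = df["close"]
    # MACD: 12/26-period EMA difference plus a 9-period signal line
    macd = close.ewm(span=12).mean() - close.ewm(span=26).mean()
    df["macd"], df["macd_signal"] = macd, macd.ewm(span=9).mean()
    # RSI(14), simple-average variant: ratio of average gains to losses
    delta = close.diff()
    gain = delta.clip(lower=0).rolling(14).mean()
    loss = (-delta.clip(upper=0)).rolling(14).mean()
    df["rsi"] = 100 - 100 / (1 + gain / loss)
    return df
```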

I got a 60% return on BTC (500% in the last year)

I think it’s time to seriously consider that MicroMasters in finance.

Code: https://github.com/thistleknot/Python-Stock/blob/master/fbprophet%20with%20vwap%20and%20automl.py

EVWMA MACD strategy backtested

[Figures: avoided the crash (SP500); BTC]

I was very upset with the performance of my fbprophet moving-window algorithm. I realized I wasn’t counting the portfolio value held in my order book in my final funds, but even after fixing that I still couldn’t get it to converge.

But… I had one more trick up my sleeve

The EVWMA (Elastic Volume-Weighted Moving Average) MACD Indicator

So I wrote it up

Here’s a picture of it in action.

It has less drawdown than simply holding BTC

My system does proportional buys (25% of funds) and sells everything when a negative trend is detected.

I read that EVWMA crossovers are very accurate, and this uses no forecasting like ARIMA or fbprophet, so it’s much easier to calculate (no need to iterate over rolling windows).
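A minimal sketch of the indicator (my own illustration, not the linked code; the 12/26/9 spans are conventional MACD defaults, assumed here):

```python
import numpy as np
import pandas as pd

def evwma(price: pd.Series, volume: pd.Series, n: int) -> pd.Series:
    """Elastic volume-weighted moving average: each bar pulls the average
    toward price in proportion to its share of the last n bars' volume."""
    vol_sum = volume.rolling(n).sum()
    out = price.copy().astype(float)
    for i in range(1, len(price)):
        if np.isnan(vol_sum.iloc[i]):
            continue                      # not enough history yet
        frac = volume.iloc[i] / vol_sum.iloc[i]
        out.iloc[i] = (1 - frac) * out.iloc[i - 1] + frac * price.iloc[i]
    return out

def evwma_macd_signal(price, volume, fast=12, slow=26, signal=9):
    """MACD built on EVWMAs; a sign flip in the histogram is the crossover."""
    macd = evwma(price, volume, fast) - evwma(price, volume, slow)
    hist = macd - macd.ewm(span=signal).mean()
    return np.sign(hist)                  # +1 = bullish, -1 = bearish
```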

EVWMA Code:
https://lnkd.in/gyw2Z35

Orderbook Code:
https://lnkd.in/g2n-N94

All possible best models using ensemble comparisons (ANN, KNN, Random Forest, XGBoost, Elastic Net, Best Subsets) (2020-11-22)

This single file does everything you need to do in machine learning for non-time-series data. (Reposted after I did some major code cleanup.) The caret library is awesome!

What does it do?
* Derives interaction terms and squared predictor terms.
* Sets a training and holdout partition.
* Uses cross validation for all models
* Ensemble comparison: Reports RMSE of each model (caret/models handles the factor reduction!) and gives summary information on the best model
* Prints out best forecast compared with holdout
* Shows correlation of forecast/holdout as goodness of fit
* Iterates through every column of the original dataset as the response term

Models applied
* KNN
* ANN (neural network)
* Random Forest
* XGBoost
* Elastic Net
* Best subset using leaps

It runs very fast.

You can easily retrofit this for your own data and derive your own inferences.

#machinelearning
Code: https://github.com/thistleknot/matrixMultiplicationRegression/blob/master/caret%20elastic%20nnet%20xg%20leaps%20models.R
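The linked file is R/caret, but the core compare-models-by-CV-RMSE loop translates to a few lines in any stack; here is a scikit-learn rendition on a stand-in dataset (my own sketch, not the repo's code):

```python
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import ElasticNet
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.neighbors import KNeighborsRegressor
from sklearn.neural_network import MLPRegressor

X, y = load_diabetes(return_X_y=True)            # stand-in dataset
X_tr, X_ho, y_tr, y_ho = train_test_split(X, y, random_state=0)

models = {
    "knn": KNeighborsRegressor(),
    "ann": MLPRegressor(max_iter=2000, random_state=0),
    "rf": RandomForestRegressor(random_state=0),
    "enet": ElasticNet(),
}

# cross-validated RMSE per model, akin to caret's resampling summary
scores = {name: -cross_val_score(m, X_tr, y_tr, cv=5,
                                 scoring="neg_root_mean_squared_error").mean()
          for name, m in models.items()}
best = min(scores, key=scores.get)

# correlation of forecast vs. holdout as a goodness-of-fit check
pred = models[best].fit(X_tr, y_tr).predict(X_ho)
print(best, scores[best], np.corrcoef(pred, y_ho)[0, 1])
```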

ZCA Whitened Poverty Model

This is the beauty of a ZCA-whitened model: no collinearity between predictor terms (i.e. zero correlation on the correlation matrix).

Meaning it’s “what you see is what you get” in terms of each predictor term’s contribution being clearly represented in the coefficients (vs. PCA). Each term is also scaled to unit variance, so the standard error is equal for each term (but you have to use a ZCA whitening matrix to convert back, and I just figured out how to do that this weekend using matrix multiplication). #datascience
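A compact sketch of the whitening and the convert-back matrix (my own illustration of the linear algebra, not the post's exact code; `eps` is a small assumed stabilizer):

```python
import numpy as np

def zca_whiten(X: np.ndarray, eps: float = 1e-8):
    """ZCA-whiten the columns of X; also return the matrices that map
    between whitened and original spaces (plain matrix multiplication)."""
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)                 # cov = E diag(vals) E^T
    W = vecs @ np.diag(1 / np.sqrt(vals + eps)) @ vecs.T      # whitening
    W_inv = vecs @ np.diag(np.sqrt(vals + eps)) @ vecs.T      # converts back
    Xw = Xc @ W          # whitened: identity covariance, zero correlations
    return Xw, W, W_inv
```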

What you see here is a 3D scatterplot of y (poverty) mapped to Income and Percentage of whites in the population (the majority class).

This model was chosen because I could map up to 3 variables in a 3D scatterplot and the adjusted R^2 was 0.74 (the strongest two-term model).

I have included a more “full” model in the bottom right (all terms significant) with an adjusted R^2 of 0.847.

The colors in the 3D scatterplot represent a tricut of poverty levels, with red signifying the least poor. You can clearly see high-leverage and/or outlier values (Hawaii and Mississippi).

Based on 2010 census data.