Category Archives: Science

Autonomous

Edward Edinger
Ego and Archetype
Christ as Paradigm of the Individuating Ego

I myself have been seeking autonomy while dealing with the loss of a friend onto whom I projected my anima (whom I identified as Hypatia). This is very consoling to read.

“The end of this passage makes clear the purpose of inciting discord. It is to achieve the solitary condition, the state of being an autonomous individual. This can be achieved only by a separation from unconscious identification with others. In the early stages, the separatio is experienced as painful strife and hostility. Parents and family are the most frequent objects of unconscious identification. Jesus singles out the father for special mention:

And call no man your father on earth, for you have one Father who is in heaven.

Parents have power over their grown children only because the latter continue to project the images of the archetypal parents to their personal parents. To call no man father means to withdraw all projections of the father archetype and discover it within…

…’…if a man will lose his ego for my sake, he will find the Self.’ [paraphrasing Jesus]


‘It is no easy matter to live a life that is modelled on Christ’s, but it is unspeakably harder to live one’s own life, as truly as Christ lived his.’ [Jung]

…blessed are those who are aware of their spiritual poverty and are humbly seeking what they need. Understood psychologically, the meaning would be: The ego which is aware of its own emptiness of spirit (life meaning) is in a fortunate position because it is now open to the unconscious and has the possibility of experiencing the archetypal psyche (the kingdom of heaven).

Blessed are those who mourn, for they shall be comforted. Mourning is caused by the loss of an object or person who was carrying an important projected value. In order to withdraw projections and assimilate their content into one’s own personality it is necessary to experience the loss of the projection as a prelude to rediscovering the content or value within. Therefore, mourners are fortunate because they are involved in a growth process. They will be comforted when the lost projected value has been recovered within the psyche.”

Sex and the Office Power Moves

I think the reason I hypothetically upset J***, C******, and T** was that, in their eyes, they are the gatekeepers to everyone’s hope: an opportunity at full time.
So when I came in there and said I was okay with getting laid off because I’d still get my master’s, it was an insult to their opportunity, which they viewed as the best they could offer anyone (exposure to CI/CD and the C-suite). They wanted me to integrate there with my master’s.
I truly believe the reason everyone acted the way they did, turning a blind eye to the affair, was that it’s expected in Abrahamic patriarchal culture to objectify women. Even S***** acknowledged that by turning a blind eye to my return (this I didn’t know at the time), because she exudes that lifestyle. With me, however, I did feel guilty: I was violating the patriarchy’s unspoken rule that objectification is expected, and simultaneously violating the Republican rule of “keep it to yourself” when it comes to scandal.
“Sex and the Office” says women caught sleeping their way to the top are generally halted once outed.
I think it’s okay to have sex in the workplace, but the way it’s discussed, IMO, is from a patriarchy-sanctioned “don’t ask, don’t tell” POV.
Which creates this anxiety in people about it, and it remains very much a power move.

Selected Response: “that is absolutely true. But desperate times man… I’ve certainly been presented with the opportunity to do so, but don’t want to subjugate myself to whims of some dude on a power trip.”**

At that point all hell broke loose, coupled with the fact that I didn’t care for the opportunity* and was creating scandal with talk of a dominatrix. I think that’s when the uppers gave their blessing to let those angry at me have at it. They did what is always done in criminal cases: try to get former friends to turn on each other and let them do the dirty work for you, to get them to show they are hungry for an opportunity and willing to get their hands dirty. That’s my best guess.
*I thought they misinterpreted me as not wanting it. I did want it; it’s just that I was truly mentally ill from seeing her, and it was hurting me psychologically with daily insomnia and mania. But they didn’t care. It violated their worldview, which expected me to be a player about it. But that’s the thing: I’m a soft heart. I can be mean, though usually unintentionally, usually by just walking away from a relationship (ghosting), ironically, but lately I’ve tried to be more mature about it and am on the other end of the spectrum, where I can’t walk away. Idk, I just can’t be mean to someone I loved (I can be passive-aggressive and make things uncomfortable though). I can’t block it out. That to me is being disingenuous to oneself, and I feel corporate life expects you to swallow your emotions, because they wanted to prove to everyone that emotions won’t get you paid.
It’s kind of like saying women want to be respected, but then you see classic objectifiers fucking the women you admire at work… I had a white-knight savior complex only to be friend-zoned and discarded, which taught me that the paradigm I was using does not get me to the desired end goal (nice guys finish last), which was at least continued friendship. So I had the affair and fulfilled the expectations of the very thing I didn’t like: patriarchy. It was very traumatic for me in a way.
It’s a turn-on to be objectified, but people want to use it in certain contexts that benefit them (cost-benefit analysis).
**My friend confirmed it is a power move and some women think they have to sleep their way to the top
So power is attraction
I got the distinct impression she was doing it for power, because people would tell me she got around, plus the fact that she was giving me personal time. Yet when I naively told her I had feelings for her (after hanging out 1:1 multiple times), she started to distance herself, like I was breaking some unspoken rule. I think she then classed me as a nice “married” guy who wasn’t going to make a move and lost interest, because I obviously wasn’t a power move nor a Don Draper to her anymore.

I think I touched the 3rd rail. Whoever obliged her power moves.

Attraction is power.

Medical Work Sample Regression Results

Ask


6th Try (Sensitivity and Specificity, same algorithm, 2 separate runs with a flipped flag)

Sensitivity: 97.4%

Found another single variable that accounts for 97.4% of my sample’s diagnoses:

End Stage Renal Disease

Cutoff was .43

I modified my algorithm to find the cutoffs from the training partition instead of the cross-validation test partitions. I was still trying to solve for specificity, but alas, it converges on sensitivity.

I’m not overly worried about it. I can always recode the response variable (flip the 1’s and 0’s) so that converging on sensitivity amounts to converging on specificity.

Specificity

Okay… so I tried for sensitivity and it converged on specificity.

The only thing I can think of is that switching from cross-validation test-partition cutoffs to training-partition cutoffs was what did it.

I use a function, optimalCutoff, with a variable optimizeFor = Zeros or Ones, depending on a flag at the beginning.

I also check for specificity or sensitivity in the confusionMatrix output based on this flipped flag.

Anyways… if I flip this flag, it does either sensitivity or specificity, so that is working. Why it’s inverted from the default parameters… still not sure.
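Roughly, the flag flip looks something like this. This is just a sketch assuming optimalCutoff, sensitivity, specificity, and confusionMatrix from the InformationValue package; the fitted model fit, the test data frame, and its outcome column are placeholder names, not my actual code.

```r
# Sketch of the flipped flag, assuming InformationValue and a fitted binary
# logistic model `fit`; `test` and `test$outcome` are placeholder names.
library(InformationValue)

solve_for <- "specificity"   # flip this flag to "sensitivity" for the other run

pred   <- predict(fit, newdata = test, type = "response")
actual <- test$outcome       # 0/1 response

# optimiseFor = "Ones" favors the positive class (sensitivity),
# "Zeros" favors the negative class (specificity)
cut <- optimalCutoff(actuals = actual, predictedScores = pred,
                     optimiseFor = if (solve_for == "sensitivity") "Ones" else "Zeros")

# report whichever metric the flag asks for, plus the full confusion matrix
if (solve_for == "sensitivity") {
  sensitivity(actual, pred, threshold = cut)
} else {
  specificity(actual, pred, threshold = cut)
}
confusionMatrix(actual, pred, threshold = cut)
```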

But this IS better than it ALWAYS converging on sensitivity.

The reason for the slightly different results each pass is, I suspect, the imputed variables and my static SPSS dataset, which is an output of just one imputed set. I use the same seed (poor programming practice, I know, but data science is supposed to converge on the same results regardless of randomization, aka cross validation). In this case it’s not so much the factors that shift as the classification scores (confusion matrix results).

5th Try (Solved Sensitivity)

Note: “3rd Try” is my specificity model (I coded the 1’s and 0’s backwards and mistook it for the true sensitivity model I was looking for)

An even better model

  • Serum Creatine
  • PET

Sensitivity: 99.3%

Cutoff: .345035

4th Try

Optimizing for cutoffs

I do not understand why, but when I tell R to test for specificity, it converges on sensitivity.

The Answer is 

  • BMI
  • ESRD
  • Diabetes.Mellitus
  • Liver.cirrhosis
  • Hepatitis.B
  • SOB
  • Coagulopathy
  • Constant Term
  • Cutoff: .475
  • Sensitivity: 93.5%

All metrics are derived from the test partitions of cross validation (including the cutoffs). I’m hitting the ball all right.
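For reference, deriving the cutoff and the metric from the held-out folds looks roughly like this; the data frame df, its outcome column, and the fold count are placeholders rather than my actual script.

```r
# Sketch: per-fold cutoff and sensitivity taken from the held-out test partitions,
# then averaged. `df` and its 0/1 `outcome` column are placeholder names.
library(InformationValue)

set.seed(42)
k <- 10
folds <- sample(rep(1:k, length.out = nrow(df)))

results <- t(sapply(1:k, function(i) {
  train <- df[folds != i, ]
  test  <- df[folds == i, ]
  fit   <- glm(outcome ~ ., data = train, family = binomial)
  pred  <- predict(fit, newdata = test, type = "response")
  cut   <- optimalCutoff(test$outcome, pred, optimiseFor = "Ones")
  c(cutoff = cut, sensitivity = sensitivity(test$outcome, pred, threshold = cut))
}))

colMeans(results)   # cross-validated cutoff and sensitivity
```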

3rd try (Solved Specificity)

  • Cross Validation
  • Binary Logistic
  • Categories

This is my 3rd try at that Medical Data problem and the ask was to test for sensitivity.

This is my final result

I hit the ball out of the park.

Problem with this: I forgot my 1’s were incorrectly coded for the class of non interest.

Optimized for sensitivity using cross validation 🙂

Code is saved on my private github. It was a lot of trial and error, but I got it.

2nd try

Using guide here: http://www.sthda.com/english/articles/36-classification-methods-essentials/150-stepwise-logistic-regression-essentials-in-r/

I initially tried cross validation using this hash matrix, but it’s still a WIP

Fundamental weakness: not optimized for sensitivity. Bugs in code. 2 level variables shouldn’t be factors.
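The core of the linked guide is stepwise selection on a logistic model, roughly like this; df and outcome stand in for the medical dataset and its response.

```r
# Sketch of the guide's stepwise logistic regression; `df` and `outcome`
# are stand-ins for the medical dataset and its 0/1 response.
library(MASS)

full_model <- glm(outcome ~ ., data = df, family = binomial)
step_model <- stepAIC(full_model, trace = FALSE)   # stepwise selection by AIC

summary(step_model)   # retained terms (cf. the Answer below)
```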

Answer

  • SEX2
  • Liver.cirrhosis2
  • Cough2
  • SOB2
  • INR
  • BAL.cytology2
  • BAL.cytology3
  • EBUS.TBNA..no.of.biopsies.
  • Bronchoscopy.findings..EBL.vs.No.EBL..2

1st Try

I finished a work sample challenge for medical data
I even imputed data!

Yeah, I consider myself a data scientist

70% accuracy

Fundamental weakness: didn’t use factors
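The fix is just declaring the categorical columns as factors before fitting (keeping in mind the later note, in the 2nd try above, that two-level 0/1 flags don’t need to be); the names here are placeholders.

```r
# Sketch: declare multi-level categorical predictors as factors before fitting;
# per the "2nd try" note above, 2-level 0/1 flags can stay numeric.
# `df`, `outcome`, and the column names are placeholders.
df$BAL.cytology <- factor(df$BAL.cytology)   # multi-level category -> factor

fit <- glm(outcome ~ ., data = df, family = binomial)
```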


Income Regression Model based on State Features

#DataScience

Data is based on 2007 Statistical Abstract of the United States

I’ve thoroughly analyzed my “best” cross-validated model, further pruned it using backwards stepwise regression on the final dataset from the cross-validated term algorithm, and run model diagnostics, highlighting variables with positive residuals in green and negative residuals in red (for further analysis).

This is the type of work I was thinking about publishing

This model is the “Income” model.  I’ve included quadratic terms and got an amazing MAPE of 2.65% and an Adjusted R^2 of .97

I’ve noticed the cross validation pruned the hierarchical dependency (Crime). I’m not sure what to make of that atm, but I trust it, knowing the MAPE was cross validated. I know I can exclude INTERACTION hierarchy dependencies (unsure about quadratic), and since Crime is also captured in the interactions, technically maybe that’s why it’s still showing significant.

I’ve included studentized residuals and mapped them against Cook’s distance, which gives a great view of outliers.

I would give my stamp of approval on this model and say it passes the 4 model assumptions
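That outlier view is roughly the following, assuming the fitted income model is fit and the states are the row names of the data it was fit on:

```r
# Sketch of the outlier/influence view: studentized residuals against Cook's
# distance. `fit` is the fitted income model; states are assumed row names.
stud  <- rstudent(fit)          # studentized residuals
cooks <- cooks.distance(fit)    # influence of each state

plot(cooks, stud,
     xlab = "Cook's distance", ylab = "Studentized residual",
     main = "Outliers and influential states")
text(cooks, stud, labels = names(stud), pos = 3, cex = 0.7)
abline(h = c(-2, 2), lty = 2)   # rough bands for flagging residual outliers
```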

States of Note

* Alaska

* Arizona

* Connecticut

* Illinois

* Louisiana

* California

* Maryland

Backwards Best Subset Cross Validation including interacted terms

Some best formula inferences

  • Income,Poverty,White,Unemployed,Doctors*Infant.Mort,Doctors*Traf.Deaths,Doctors*Unemployed,Doctors*University,Infant.Mort*Unemployed,Infant.Mort*White
    • MAPE of 4.37%
  • Poverty,Crime,Traf.Deaths,Unemployed,Crime*Infant.Mort,Crime*Unemployed,Crime*White,Doctors*University,Income*Infant.Mort
    • MAPE of 6.57%
  • University,Poverty,Infant.Mort,White,Crime*Income,Doctors*Traf.Deaths,Doctors*Unemployed,Doctors*White
    • MAPE of 7.26%

I used concepts from backward stepwise regression to find the best set of factors for inclusion in the regression. I got the idea from p-values, but I wanted to use cross validation. There is a post that does exactly this, but I realized the results might not be the best CV scores, so I coded a loop (a few, actually). This is an idea I’ve been working on, on my LinkedIn, for a while, going back and forth trying to derive millions of combinations of factors (interactions, quadratic terms, etc.) to test.

I came up with the most ingenious solution for the best cross validated formula using interactions.

1. Derive all interactions.

2. Develop Cross Validation train/test splits

3. Build full model

3a. From the current model, remove one variable at a time and find the removal that performs best over all folds.

3b. Repeat 3a until the model does not improve anymore. That is the best formula.

It’s kind of like a genetic algorithm, except with no mutations.

I got the idea from backwards selection. It’s also a lot easier to code than trying to build a manual backwards selection.

I realized overfitting all terms wouldn’t be an issue as the validation partition wasn’t used for training. So variables that were not meant to be in the equation will stand out the most.

I imagine something similar could be done for forward regression. Start with the single most powerful variable and work your way forward.
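A minimal sketch of the backward cross-validated elimination, assuming the states data is in a data frame df with Income as the response; the fold count and the MAPE helper are illustrative, not the exact script on GitHub:

```r
# Sketch of the backward CV elimination (steps 1-3b above).
# `df` holds the states data with an `Income` response column.
set.seed(1)
k <- 5
folds <- sample(rep(1:k, length.out = nrow(df)))

mape <- function(actual, pred) mean(abs((actual - pred) / actual)) * 100

cv_mape <- function(term_set) {
  f <- reformulate(term_set, response = "Income")
  mean(sapply(1:k, function(i) {
    fit <- lm(f, data = df[folds != i, ])
    mape(df$Income[folds == i], predict(fit, df[folds == i, ]))
  }))
}

# steps 1 & 3: start from the full model with all pairwise interactions
# (on a small dataset this start may be rank-deficient; main effects also work)
terms_left <- labels(terms(Income ~ .^2, data = df))
best <- cv_mape(terms_left)

repeat {                                  # steps 3a / 3b
  if (length(terms_left) == 1) break
  scores <- sapply(terms_left, function(v) cv_mape(setdiff(terms_left, v)))
  if (min(scores) >= best) break          # no single drop improves the CV MAPE
  best       <- min(scores)
  terms_left <- setdiff(terms_left, names(which.min(scores)))
}

terms_left   # the "best formula" terms
best         # its cross-validated MAPE
```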

Github (private): https://github.com/thistleknot/multipleCorrelation/blob/master/interactionsBackwardsCrossValidationBestSubset.R

Code: https://hastebin.com/sapixopete.bash

Outputted scores: https://hastebin.com/polamefugo.css

data (states.csv): https://hastebin.com/apuvusevir.css

Note: the source call to mape.r can be commented out

Cross Validation over every factorial combination

I learn and then apply

I dropped the self-filtering method of deriving multiple R manually using matrix equations in favor of cross validation. Once the best model (the set of parameters with the lowest RMSE) is chosen, I then derive the final model against the full dataset.

I use combn to iterate over every combination of factors. #DataScience
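In outline it’s something like the following; df, its Income response, and the fold setup are placeholders, and exhaustive enumeration like this is only practical for a modest number of predictors:

```r
# Sketch: enumerate every subset of predictors with combn, score each by
# cross-validated RMSE, and refit the winner on the full dataset.
# `df` and its `Income` response column are placeholder names.
preds <- setdiff(names(df), "Income")

set.seed(1)
k <- 5
folds <- sample(rep(1:k, length.out = nrow(df)))

cv_rmse <- function(term_set) {
  f <- reformulate(term_set, response = "Income")
  mean(sapply(1:k, function(i) {
    fit <- lm(f, data = df[folds != i, ])
    sqrt(mean((df$Income[folds == i] - predict(fit, df[folds == i, ]))^2))
  }))
}

combos <- unlist(lapply(seq_along(preds), function(m)
  combn(preds, m, simplify = FALSE)), recursive = FALSE)

scores <- sapply(combos, cv_rmse)
best   <- combos[[which.min(scores)]]

final_model <- lm(reformulate(best, response = "Income"), data = df)  # refit on full data
summary(final_model)
```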

Multiple Regression Coefficients, Correlation Matrix, and Significance

I redid my Matrix Multiple Regression spreadsheet so it’s easier to read, with fewer needless matrix multiplication outputs and less cluttered formulas.

Includes

* Inverted (Transposed Predictor Matrix * Predictor Matrix), i.e. (X'X)^-1
* Covariance Matrix
* Inverted Correlation Matrix

I’ve

* reproduced all Data Analysis ToolPak output
* without having to use LINEST
* or results from the ToolPak
* or any 3rd-party plugins

file: https://drive.google.com/…/1eQcyp9hoqeJDkw8QfP-cjtV1i…/view…

Multiple Regression Coefficients and Significance Using Matrix Algebra

I’m quite proud of myself.  I spent a good portion of the day trying to figure out how to calculate p-scores by hand (by hand meaning in Excel, as opposed to using R or the Data Analysis ToolPak).

Granted, I did use the Analysis ToolPak to derive the residuals, but that was a shortcut rather than using LINEST. I DID derive the coefficients manually using matrix algebra, so I could have derived the y’s and then the residuals. But since I wasn’t focused on that, but on the t-scores and the subsequent p-values, I went straight to those, because that’s what I want to model in R.
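In R the same matrix algebra looks roughly like this (mtcars is just a stand-in for the spreadsheet’s data):

```r
# Sketch of the spreadsheet's matrix algebra in R; mtcars stands in for the data.
X <- cbind(Intercept = 1, as.matrix(mtcars[, c("wt", "hp")]))   # design matrix
y <- mtcars$mpg

XtX_inv <- solve(t(X) %*% X)             # inverted (X'X)
beta    <- XtX_inv %*% t(X) %*% y        # coefficients
resid   <- y - X %*% beta                # residuals (no LINEST / ToolPak needed)
df_res  <- nrow(X) - ncol(X)             # residual degrees of freedom
s2      <- sum(resid^2) / df_res         # residual variance
se      <- sqrt(diag(s2 * XtX_inv))      # standard errors
t_vals  <- as.vector(beta) / se          # t scores
p_vals  <- 2 * pt(-abs(t_vals), df_res)  # two-sided p values

data.frame(coef = as.vector(beta), se = se, t = t_vals, p = p_vals,
           row.names = colnames(X))      # matches summary(lm(mpg ~ wt + hp, mtcars))
```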

Now I can derive the P Scores of trainControl in R 🙂

Next I’m going to add correlation matrix to this sheet, which is pretty easy since I have the covariance matrix

This file can be found in my uploads

and I got it!

https://drive.google.com/file/d/1mpgHHFwPEoU68Wy9LhYxJQrW88UR08Tb/view?usp=sharing

Chimera

Powerpoint

Report

Project homepage Readme

Using ICPSR polling data of 8th & 10th grade Americans, I transform a set of predictor terms into what I call a “semiotic grid” of 1’s and 0’s, which is then used to identify a class of 1’s and 0’s of desired outcomes on 3 specific response terms: GPA, gang fights, and (gasp) presence of psychedelic drug use.

I use Monte Carlo resampling to achieve class balancing and a modified bestglm algorithm to get a wider set of terms via cross validation, followed by cross-validated holdout analysis, and then tabulate the results. That’s just for initial factor reduction / pooling of potential candidates. These terms then go through more class balancing and cross validation once more, using the actual, unmodified bestglm, to arrive at a final regression formula as well as the terms that are always population-significant, closing with ROC.
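One balanced pass of that pipeline looks roughly like this; the grid data frame (the binarized semiotic grid with its 0/1 response named y), the resample size, and the IC choice are placeholders rather than the project’s exact settings:

```r
# Sketch of one balanced pass: Monte Carlo class balancing, bestglm subset
# selection, then ROC. `grid` is the 0/1 semiotic grid with response `y`.
library(bestglm)
library(pROC)

set.seed(7)
ones  <- grid[grid$y == 1, ]
zeros <- grid[grid$y == 0, ]
n     <- min(nrow(ones), nrow(zeros))

# Monte Carlo resampling for class balance: equal draws from each class
balanced <- rbind(ones[sample(nrow(ones), n), ],
                  zeros[sample(nrow(zeros), n), ])

# bestglm expects the predictors first and the response, named y, last
Xy  <- balanced[, c(setdiff(names(balanced), "y"), "y")]
fit <- bestglm(Xy, family = binomial, IC = "BIC")

summary(fit$BestModel)                                   # selected terms
roc(Xy$y, predict(fit$BestModel, type = "response"))     # closing ROC
```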

I am offering the project as a kind of open house for potential employers, to determine whether my skill set would be a good fit for what you hope to do with numbers.