# StatQuest

## An epic journey through computational methods for molecular geneticists.

# Gradient Boost Part 4: Classification Details!!!

## 8 thoughts on “Gradient Boost Part 4: Classification Details!!!”




Hi Josh,

I am a fan of your music albums and StatQuest videos.

I need some help, if you are into deep learning. Could you post the differences between the deep learning architectures, like RNNs, LSTMs, CNNs, ANNs, bidirectional RNNs, etc.?

With best regards,

Abhishek


Josh,

I am SUCH A BIG FAN!!!!

I study applied math & computer science at BYU, and I think I’m close to finishing binge watching your entire channel.

I just finished writing a little website that does Thompson Sampling with cats. I feel like you’d think it was fun.

http://www.ericriddoch.info/thompson_sampling

I can’t wait to see all of the ML videos you will come up with.

– Eric


Awesome! I just tweeted about it. :)


Hi Josh, I intend to become as good at statistics as you, and I have been religiously following your videos for the past couple of weeks. I would appreciate suggestions for practice problems, and a road map to further strengthen my grip on statistics.


I think the absolute best practice is to make friends with people who need data analysis. Working with them to solve problems and do statistics with real data is how you will learn the most in the shortest amount of time.


You are simply GREAT! A fine blend of unimaginable simplicity and quality! With regards,


Thank you!!!


Regression trees are fit on the negative gradient of the binomial or multinomial deviance loss function. Binary classification is a special case where only a single regression tree is induced per boosting round.
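A minimal sketch of that binary-classification case, with hypothetical toy data: each round, the negative gradient of the binomial deviance with respect to the log-odds is just `y - p`, a single regression tree is fit to those residuals, and each leaf's output is `sum(residuals) / sum(p * (1 - p))` for that leaf. This is an illustration of the idea, not a production implementation.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Hypothetical toy data: one feature, binary labels.
X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0], [6.0]])
y = np.array([0, 0, 0, 1, 1, 1])

# Initial prediction: the log-odds of the overall positive rate.
F = np.full(len(y), np.log(y.mean() / (1 - y.mean())))

for _ in range(10):
    p = 1 / (1 + np.exp(-F))   # current predicted probabilities
    residuals = y - p          # negative gradient of the binomial deviance w.r.t. F

    # One regression tree per round (a stump here, for simplicity).
    tree = DecisionTreeRegressor(max_depth=1).fit(X, residuals)

    # Leaf output in log-odds space: sum(residuals) / sum(p * (1 - p)) per leaf.
    leaves = tree.apply(X)
    gamma = np.zeros(len(y))
    for leaf in np.unique(leaves):
        idx = leaves == leaf
        gamma[idx] = residuals[idx].sum() / (p[idx] * (1 - p[idx])).sum()

    F += 0.1 * gamma           # update log-odds with a 0.1 learning rate

p_final = 1 / (1 + np.exp(-F))  # converted back to probabilities
```

After a few rounds the log-odds for the two groups pull apart, so the predicted probabilities end up on the correct sides of 0.5 for this toy data.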
