<rss xmlns:atom="http://www.w3.org/2005/Atom" version="2.0">
    <channel>
        <title>Optimization - Tag - Steven Purcell</title>
        <link>http://stevenpurcell.ninja/tags/optimization/</link>
        <description>Optimization - Tag - Steven Purcell</description>
        <generator>Hugo -- gohugo.io</generator><language>en</language><managingEditor>steven.ray.purcell@gmail.com (Steven Purcell)</managingEditor>
            <webMaster>steven.ray.purcell@gmail.com (Steven Purcell)</webMaster><lastBuildDate>Sun, 17 Dec 2023 14:56:54 -0500</lastBuildDate><atom:link href="http://stevenpurcell.ninja/tags/optimization/" rel="self" type="application/rss+xml" /><item>
    <title>Optimizing Gradient Boosting Models</title>
    <link>http://stevenpurcell.ninja/posts/optimizing_gradient_boosted_models/</link>
    <pubDate>Sun, 17 Dec 2023 14:56:54 -0500</pubDate>
    <author>Steven Purcell</author>
    <guid>http://stevenpurcell.ninja/posts/optimizing_gradient_boosted_models/</guid>
    <description><![CDATA[Gradient Boosting Models Gradient boosting classifiers are a powerful type of machine learning algorithm that often outperforms many other kinds of classifiers. In simplest terms, gradient boosting algorithms learn from the mistakes they make by optimizing with gradient descent. A gradient boosting model uses the gradient, the direction of steepest increase of the loss function, and steps in the opposite direction so that the loss decreases rapidly with each iteration. Gradient boosting models can be used for classification or regression.]]></description>
</item>
</channel>
</rss>
