Network analysis algorithms give you your Google search results, Facebook news feed and Netflix and Hulu recommendations.
These algorithms have proved invaluable for organizing, evaluating and utilizing information.
It's a rare day when our lives are not impacted by them.
And, in some respects, an algorithm's repetitive nature simply amplifies whatever it is given: the same process, flaws included, is applied again and again.
Those shortcomings can have far-reaching consequences. Imperfect travel routes or poor romantic matches are merely frustrating and annoying; when algorithms inform decisions such as criminal sentencing, the stakes are far higher.
Some algorithms “learn” from themselves by running repeatedly: they analyze their own results, feed that analysis back into the next run, and compound what they have learned.
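That feedback loop can be made concrete with a minimal sketch. Everything here is a hypothetical illustration, not any particular company's system: a single-weight model repeatedly measures its own error and folds that measurement back into the next pass.

```python
# Minimal sketch of an algorithm that "learns" by running repeatedly:
# each pass analyzes the current model's error and reapplies that
# analysis to the next run. All names and data are invented.

def iterative_learn(data, passes=100, rate=0.1):
    """Fit a single weight w so that the prediction w*x approximates y."""
    w = 0.0
    for _ in range(passes):
        # Analyze the results of the current model...
        error = sum((w * x - y) * x for x, y in data) / len(data)
        # ...and feed that analysis back into the next pass.
        w -= rate * error
    return w

# Toy data where y = 2x: repeated passes compound toward w ≈ 2.
samples = [(1, 2), (2, 4), (3, 6)]
print(round(iterative_learn(samples), 2))
```

Note that the loop never questions its inputs: it converges toward whatever pattern the data contains, which is exactly why flawed data matters.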
This process creates numerous models, which are then analyzed and tested as more data becomes available.

We use algorithms to navigate our cars while listening to music selected by algorithms. We invest our hard-earned dollars by means of algorithms. Data collection, and our reliance on it, have evolved extremely rapidly.

Wisconsin law now requires presentence investigation reports utilizing COMPAS to include written instructions to the sentencing judges saying risk scores may not be used to determine whether an offender is incarcerated, or to determine the severity of the sentence. Risk scores also cannot be used as the determinative factor in deciding whether the offender can be supervised safely in the community.

Automated learning on inherently biased data leads to biased results. For example, arrest rates filtered by ZIP code may contain racial bias.
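The last point can be shown with a deliberately oversimplified sketch. The ZIP codes and all numbers below are invented, and the "model" is a caricature, not COMPAS or any real system; it only illustrates how a skew in the records becomes a skew in the scores.

```python
# Hedged toy example: a model trained on arrest counts inherits
# whatever skew produced those counts. All figures are invented.

# Hypothetical recorded arrests per 1,000 residents. Assume the true
# offense rate is identical in both areas, but ZIP_A is patrolled more
# heavily, so more of its offenses become recorded arrests.
recorded_arrests = {"ZIP_A": 120, "ZIP_B": 30}

def naive_risk_score(zip_code):
    """A naive 'model' that equates recorded arrests with risk."""
    return recorded_arrests[zip_code] / 1000

# The scores differ fourfold even though the assumed underlying
# behavior is the same: biased data in, biased results out.
print(naive_risk_score("ZIP_A"), naive_risk_score("ZIP_B"))
```

Nothing in the code is "wrong" in a programming sense; the bias enters entirely through the data it is handed, which is what makes it hard to detect after the fact.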