{"id":69,"date":"2019-01-19T17:32:30","date_gmt":"2019-01-20T00:32:30","guid":{"rendered":"http:\/\/codingrestart.com\/?p=69"},"modified":"2019-01-19T18:55:32","modified_gmt":"2019-01-20T01:55:32","slug":"ml-msdn-and-xavier","status":"publish","type":"post","link":"https:\/\/codingrestart.com\/home\/ml-msdn-and-xavier\/","title":{"rendered":"ML: MSDN and Xavier"},"content":{"rendered":"\n<p>Microsoft\u2019s MSDN magazine for its development community has been publishing quite a few introductory articles on Machine Learning (ML) over the past few months. January&#8217;s <a href=\"https:\/\/msdn.microsoft.com\/en-us\/magazine\/mt848698\">issue<\/a> emphasizes ML with another series of articles, albeit of varying quality. I liked this quote from the editorial &#8220;<a href=\"https:\/\/msdn.microsoft.com\/en-us\/magazine\/mt848700\">Advancing AI<\/a>&#8221;: \u201cML is a huge graph that requires you to repeatedly examine topics, learning a bit more each time and understanding how topics are interrelated\u201d. An interesting article, &#8220;<a href=\"https:\/\/msdn.microsoft.com\/en-us\/magazine\/mt848704\">Introduction to PyTorch on Windows<\/a>&#8221; by James McCaffrey, further highlights recent rumbles about Microsoft switching from developing and using its own CNTK library to the open-source <a href=\"https:\/\/github.com\/pytorch\/pytorch\">PyTorch<\/a>, developed mainly by Facebook. By GitHub activity, PyTorch sees roughly half the activity of <a href=\"https:\/\/github.com\/tensorflow\/tensorflow\">TensorFlow<\/a>, and both dwarf <a href=\"https:\/\/github.com\/Microsoft\/CNTK\">CNTK<\/a>&#8217;s numbers. From this perspective, Microsoft made the right choice in abandoning CNTK. 
For an example of PyTorch staying current, the latest version of <a href=\"https:\/\/pytorch.org\/blog\/the-road-to-1_0\/\">PyTorch<\/a> introduces two execution modes for Python-based ML that enable just-in-time (JIT) compilation, improving PyTorch&#8217;s suitability for production environments. Another article, &#8220;<a href=\"https:\/\/msdn.microsoft.com\/en-us\/magazine\/mt848708\">Self-Organizing Maps Using C#<\/a>&#8221;, covers an ML technique that is not widely known and whose practical usefulness seems questionable. The third article, &#8220;Leveraging the Beliefs-Desires-Intentions Agent Architecture&#8221;, is poorly written and shamelessly plugs the author&#8217;s travel agency in the provided sample app.<\/p>\n\n\n\n<p>For most ML problems, learning is achieved by applying non-linear activation functions to a model&#8217;s intermediate outputs. This allows the algorithms to capture non-linearity in the data and predict unseen data with greater accuracy. The long-trusted sigmoid activation function is being replaced by other activation functions, and weight initialization is being improved as well. In fact, the study &#8220;<a href=\"http:\/\/proceedings.mlr.press\/v9\/glorot10a\/glorot10a.pdf\">Understanding the difficulty of training deep feedforward neural networks<\/a>&#8221; details replacing the standard random initialization of Deep Learning (DL) networks with better-scaled alternatives. DL networks are neural networks with a large number of hidden layers. The study gets rather mathy halfway through, but most of it is digestible. The resulting algorithm is now called Xavier initialization after the paper&#8217;s first author, Xavier Glorot: in its uniform form, each weight is drawn from the range [-sqrt(6\/(n_in+n_out)), sqrt(6\/(n_in+n_out))], where n_in and n_out are the layer&#8217;s fan-in and fan-out. It is supported by all leading ML frameworks.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Microsoft\u2019s MSDN magazine for its development community has been publishing quite a few introductory articles on Machine Learning (ML) over the past few months. January&#8217;s issue emphasizes ML with another series of articles, albeit of varying quality. 
I liked this quote from the editorial &#8220;Advancing AI&#8220;: \u201dML is a huge graph that requires you to &hellip; <a href=\"https:\/\/codingrestart.com\/home\/ml-msdn-and-xavier\/\" class=\"more-link\">Continue reading<span class=\"screen-reader-text\"> &#8220;ML: MSDN and Xavier&#8221;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"om_disable_all_campaigns":false,"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"jetpack_post_was_ever_published":false,"_jetpack_newsletter_access":"","_jetpack_dont_email_post_to_subs":false,"_jetpack_newsletter_tier_id":0,"_jetpack_memberships_contains_paywalled_content":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":true,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2}},"categories":[4],"tags":[],"class_list":["post-69","post","type-post","status-publish","format-standard","hentry","category-machine-learning"],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"jetpack_shortlink":"https:\/\/wp.me\/paAAlH-17","jetpack-related-posts":[{"id":51,"url":"https:\/\/codingrestart.com\/home\/really-old-geek\/","url_meta":{"origin":69,"position":0},"title":"Really Old Geek","author":"Viktor Sanek","date":"December 28, 2018","format":false,"excerpt":"The New York Times published an article about Donald Knuth \u201cThe Yoda of Silicon Valley\u201d. The article might idolize his work and impact, but provides a great overview of his work. Mr. 
Knuth is a living legend in the field of computer science, known mainly for his seminal book on\u2026","rel":"","context":"In &quot;C\/C++&quot;","block_context":{"text":"C\/C++","link":"https:\/\/codingrestart.com\/home\/category\/uncategorized\/cc\/"},"img":{"alt_text":"","src":"","width":0,"height":0},"classes":[]},{"id":171,"url":"https:\/\/codingrestart.com\/home\/theory-behind-anki\/","url_meta":{"origin":69,"position":1},"title":"Theory Behind Anki","author":"Viktor Sanek","date":"April 23, 2019","format":false,"excerpt":"This post covers the theory behind Anki and follows the previous post introducing Anki. The last post in this series describes best practices for using Anki. One of the best meta-analyses reviewing best approaches to learning is Improving Students\u2019 Learning With Effective Learning Techniques by Dunlosky et al. If I\u2026","rel":"","context":"In &quot;Anki&quot;","block_context":{"text":"Anki","link":"https:\/\/codingrestart.com\/home\/category\/anki\/"},"img":{"alt_text":"","src":"","width":0,"height":0},"classes":[]},{"id":180,"url":"https:\/\/codingrestart.com\/home\/anki-best-practices\/","url_meta":{"origin":69,"position":2},"title":"Anki: Best Practices","author":"Viktor Sanek","date":"April 28, 2019","format":false,"excerpt":"This post concludes the series of posts on Anki and follows the post introducing Anki and another one about the theory behind Anki. I have been using Anki for five years and achieved high consistency in completing daily reviews. 
Here are my best practices: Be selective about cards you create\u2026","rel":"","context":"In &quot;Anki&quot;","block_context":{"text":"Anki","link":"https:\/\/codingrestart.com\/home\/category\/anki\/"},"img":{"alt_text":"","src":"","width":0,"height":0},"classes":[]},{"id":238,"url":"https:\/\/codingrestart.com\/home\/mona-lisas-video\/","url_meta":{"origin":69,"position":3},"title":"Mona Lisa&#8217;s video","author":"Viktor Sanek","date":"June 16, 2019","format":false,"excerpt":"For centuries, people have wondered about Mona Lisa's smile. Now they can stop wondering and just watch her videos. A group of AI researchers published a paper titled \"Few-Shot Adversarial Learning of Realistic Neural Talking Head Models\", where they describe a new algorithm to generate videos of peoples' heads (talking\u2026","rel":"","context":"In &quot;Machine Learning&quot;","block_context":{"text":"Machine Learning","link":"https:\/\/codingrestart.com\/home\/category\/machine-learning\/"},"img":{"alt_text":"","src":"https:\/\/i0.wp.com\/codingrestart.com\/wp-content\/uploads\/2019\/06\/Gans.png?resize=350%2C200&ssl=1","width":350,"height":200,"srcset":"https:\/\/i0.wp.com\/codingrestart.com\/wp-content\/uploads\/2019\/06\/Gans.png?resize=350%2C200&ssl=1 1x, https:\/\/i0.wp.com\/codingrestart.com\/wp-content\/uploads\/2019\/06\/Gans.png?resize=525%2C300&ssl=1 1.5x, https:\/\/i0.wp.com\/codingrestart.com\/wp-content\/uploads\/2019\/06\/Gans.png?resize=700%2C400&ssl=1 2x"},"classes":[]},{"id":233,"url":"https:\/\/codingrestart.com\/home\/dangers-of-nlp\/","url_meta":{"origin":69,"position":4},"title":"Dangers of NLP","author":"Viktor Sanek","date":"June 9, 2019","format":false,"excerpt":"Natural language processing (NLP) continues its rapid advance, leading some people to fear its latest results. 
The research organization OpenAI published a blog post titled \"Better Language Models and Their Implications\" summarizing its progress on \"predicting the next word, given all of the previous words within some text\". OpenAI calls\u2026","rel":"","context":"In &quot;Machine Learning&quot;","block_context":{"text":"Machine Learning","link":"https:\/\/codingrestart.com\/home\/category\/machine-learning\/"},"img":{"alt_text":"","src":"https:\/\/i0.wp.com\/codingrestart.com\/wp-content\/uploads\/2019\/06\/GPT-2.png?resize=350%2C200&ssl=1","width":350,"height":200,"srcset":"https:\/\/i0.wp.com\/codingrestart.com\/wp-content\/uploads\/2019\/06\/GPT-2.png?resize=350%2C200&ssl=1 1x, https:\/\/i0.wp.com\/codingrestart.com\/wp-content\/uploads\/2019\/06\/GPT-2.png?resize=525%2C300&ssl=1 1.5x, https:\/\/i0.wp.com\/codingrestart.com\/wp-content\/uploads\/2019\/06\/GPT-2.png?resize=700%2C400&ssl=1 2x"},"classes":[]},{"id":1,"url":"https:\/\/codingrestart.com\/home\/hello-programmers\/","url_meta":{"origin":69,"position":5},"title":"Hello programmers!","author":"Viktor Sanek","date":"May 7, 2017","format":false,"excerpt":"Welcome to CodingRestart.com, blog for seasoned developers that want to upgrade their skills. If any of the following is true for you, this site is for you: You feel threatened by younger colleagues, because you do not know popular programming languages and buzzwords they use. 
You think Kernighan and Ritchie's\u2026","rel":"","context":"Similar post","block_context":{"text":"Similar post","link":""},"img":{"alt_text":"","src":"","width":0,"height":0},"classes":[]}],"_links":{"self":[{"href":"https:\/\/codingrestart.com\/home\/wp-json\/wp\/v2\/posts\/69","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/codingrestart.com\/home\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/codingrestart.com\/home\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/codingrestart.com\/home\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/codingrestart.com\/home\/wp-json\/wp\/v2\/comments?post=69"}],"version-history":[{"count":4,"href":"https:\/\/codingrestart.com\/home\/wp-json\/wp\/v2\/posts\/69\/revisions"}],"predecessor-version":[{"id":75,"href":"https:\/\/codingrestart.com\/home\/wp-json\/wp\/v2\/posts\/69\/revisions\/75"}],"wp:attachment":[{"href":"https:\/\/codingrestart.com\/home\/wp-json\/wp\/v2\/media?parent=69"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/codingrestart.com\/home\/wp-json\/wp\/v2\/categories?post=69"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/codingrestart.com\/home\/wp-json\/wp\/v2\/tags?post=69"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}