From 7abc2c3979199f6494516687eb7c59e51215e635 Mon Sep 17 00:00:00 2001
From: Sylvain Gugger
Date: Thu, 14 May 2020 05:18:31 -0700
Subject: [PATCH] First batch of edits

---
 01_intro.ipynb                | 597 +++++++++++++++--------------
 02_production.ipynb           | 384 ++++++++++---------
 03_ethics.ipynb               | 387 +++++++++----------
 04_mnist_basics.ipynb         | 682 +++++++++++++++++++---------------
 05_pet_breeds.ipynb           | 338 ++++++++---------
 06_multicat.ipynb             | 508 +++++--------------------
 07_sizing_and_tta.ipynb       | 174 ++++-----
 08_collab.ipynb               |  26 +-
 09_tabular.ipynb              |  60 +--
 10_nlp.ipynb                  |  28 +-
 11_midlevel_data.ipynb        |  14 +-
 12_nlp_dive.ipynb             |  28 +-
 13_convolutions.ipynb         |  32 +-
 14_resnet.ipynb               |  12 +-
 15_arch_details.ipynb         |  12 +-
 16_accel_sgd.ipynb            |  14 +-
 17_foundations.ipynb          |  23 +-
 18_CAM.ipynb                  |   6 +-
 19_learner.ipynb              |   6 +-
 20_conclusion.ipynb           |   2 +-
 app_blog.ipynb                |  13 +-
 clean/01_intro.ipynb          |  91 +++--
 clean/02_production.ipynb     |  69 ++--
 clean/03_ethics.ipynb         |  91 +++--
 clean/04_mnist_basics.ipynb   | 119 ++++--
 clean/05_pet_breeds.ipynb     |  60 +--
 clean/06_multicat.ipynb       | 346 +----------------
 clean/07_sizing_and_tta.ipynb |  28 +-
 clean/08_collab.ipynb         |  24 +-
 clean/09_tabular.ipynb        |  58 +--
 clean/10_nlp.ipynb            |  28 +-
 clean/11_midlevel_data.ipynb  |  14 +-
 clean/12_nlp_dive.ipynb       |  28 +-
 clean/13_convolutions.ipynb   |  32 +-
 clean/14_resnet.ipynb         |  12 +-
 clean/15_arch_details.ipynb   |  12 +-
 clean/16_accel_sgd.ipynb      |  14 +-
 clean/17_foundations.ipynb    |  27 +-
 clean/18_CAM.ipynb            |   6 +-
 clean/19_learner.ipynb        |   6 +-
 clean/20_conclusion.ipynb     |   2 +-
 clean/app_blog.ipynb          |  12 +-
 42 files changed, 2010 insertions(+), 2415 deletions(-)

diff --git a/01_intro.ipynb b/01_intro.ipynb
index bca60d9cc..ab585608b 100644
--- a/01_intro.ipynb
+++ b/01_intro.ipynb
@@ -21,28 +21,28 @@
 "cell_type": "markdown", "metadata": {}, "source": [ - "# Your deep learning journey" + "# Your Deep Learning Journey" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Hello, and thank you for letting us join 
you on your deep learning journey, however far along that you may be! In this chapter, we will tell you a little bit more about what to expext in this book, introduce the key concepts behind deep learning and we will train our first models on different tasks. It doesn't matter if you don't come from a technical or a mathematical background (though that's okay too!), we wrote this book to put it in the hands of as many people as possible." + "Hello, and thank you for letting us join you on your deep learning journey, however far along that you may be! In this chapter, we will tell you a little bit more about what to expect in this book, introduce the key concepts behind deep learning, and train our first models on different tasks. It doesn't matter if you don't come from a technical or a mathematical background (though it's okay if you do too!); we wrote this book to make deep learning accessible to as many people as possible." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Deep learning is for everyone" + "## Deep Learning Is for Everyone" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "A lot of people assume that you need all kinds of hard-to-find stuff to get great results with deep learning, but, as you'll see in this book, those people are wrong. Here's a list of a few thing you **absolutely don't need** to do world-class deep learning:\n", + "A lot of people assume that you need all kinds of hard-to-find stuff to get great results with deep learning, but as you'll see in this book, those people are wrong. <> is a list of a few things you *absolutely don't need* to do world-class deep learning.\n", "\n", "```asciidoc\n", "[[myths]]\n", "|======\n", "```\n", "\n", - "Deep learning is a computer technique to extract and transform data – with use cases ranging from human speech recognition to animal imagery classification – by using multiple layers of neural networks. 
Each of these layers takes its inputs from previous layers and progressively refines them. The layers are trained by algorithms that minimize their errors and improve their accuracy. In this way, the network learns to perform a specified task. We will discuss training algorithms in detail in the next section." + "Deep learning is a computer technique to extract and transform data--with use cases ranging from human speech recognition to animal imagery classification--by using multiple layers of neural networks. Each of these layers takes its inputs from previous layers and progressively refines them. The layers are trained by algorithms that minimize their errors and improve their accuracy. In this way, the network learns to perform a specified task. We will discuss training algorithms in detail in the next section." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Deep learning has power, flexibility, and simplicity. That's why we believe it should be applied across many disciplines. These include the social and physical sciences, the arts, medicine, finance, scientific research, and much more. To give a personal example, despite having no background in medicine, Jeremy started Enlitic, a company that uses deep learning algorithms to diagnose illness and disease. Within months of starting the company, it was announced that their algorithm could identify malignant tumors [more accurately than radiologists](https://www.nytimes.com/2016/02/29/technology/the-promise-of-artificial-intelligence-unfolds-in-small-steps.html).\n", + "Deep learning has power, flexibility, and simplicity. That's why we believe it should be applied across many disciplines. These include the social and physical sciences, the arts, medicine, finance, scientific research, and many more. To give a personal example, despite having no background in medicine, Jeremy started Enlitic, a company that uses deep learning algorithms to diagnose illness and disease. 
Within months of starting the company, it was announced that its algorithm could identify malignant tumors [more accurately than radiologists](https://www.nytimes.com/2016/02/29/technology/the-promise-of-artificial-intelligence-unfolds-in-small-steps.html).\n", "\n", - "Here's a list of some of the thousands of tasks where deep learning, or methods heavily using deep learning, is now the best in the world:\n", + "Here's a list of some of the thousands of tasks in different areas at which deep learning, or methods heavily using deep learning, is now the best in the world:\n", "\n", - "- Natural Language Processing (NLP):: answering questions; speech recognition; summarizing documents; classifying documents; finding names, dates, etc. in documents; searching for articles mentioning a concept\n", - "- Computer vision:: satellite and drone imagery interpretation (e.g. for disaster resilience); face recognition; image captioning; reading traffic signs; locating pedestrians and vehicles in autonomous vehicles\n", - "- Medicine:: Finding anomalies in radiology images, including CT, MRI, and X-ray; counting features in pathology slides; measuring features in ultrasounds; diagnosing diabetic retinopathy\n", - "- Biology:: folding proteins; classifying proteins; many genomics tasks, such as tumor-normal sequencing and classifying clinically actionable genetic mutations; cell classification; analyzing protein/protein interactions\n", + "- Natural language processing (NLP):: Answering questions; speech recognition; summarizing documents; classifying documents; finding names, dates, etc. 
in documents; searching for articles mentioning a concept\n", + "- Computer vision:: Satellite and drone imagery interpretation (e.g., for disaster resilience); face recognition; image captioning; reading traffic signs; locating pedestrians and vehicles in autonomous vehicles\n", + "- Medicine:: Finding anomalies in radiology images, including CT, MRI, and X-ray images; counting features in pathology slides; measuring features in ultrasounds; diagnosing diabetic retinopathy\n", + "- Biology:: Folding proteins; classifying proteins; many genomics tasks, such as tumor-normal sequencing and classifying clinically actionable genetic mutations; cell classification; analyzing protein/protein interactions\n", "- Image generation:: Colorizing images; increasing image resolution; removing noise from images; converting images to art in the style of famous artists\n", - "- Recommendation systems:: web search; product recommendations; home page layout\n", - "- Playing games:: Better than humans and better than any other computer algorithm at Chess, Go, most Atari videogames, and many real-time strategy games\n", - "- Robotics:: handling objects that are challenging to locate (e.g. transparent, shiny, lack of texture) or hard to pick up\n", - "- Other applications:: financial and logistical forecasting; text to speech; much much more..." + "- Recommendation systems:: Web search; product recommendations; home page layout\n", + "- Playing games:: Chess, Go, most Atari video games, and many real-time strategy games\n", + "- Robotics:: Handling objects that are challenging to locate (e.g., transparent, shiny, lacking texture) or hard to pick up\n", + "- Other applications:: Financial and logistical forecasting, text to speech, and much more..." 
] }, { @@ -91,23 +91,23 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Neural networks: a brief history" + "## Neural Networks: A Brief History" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "In 1943 Warren McCulloch, a neurophysiologist, and Walter Pitts, a logician, teamed up to develop a mathematical model of an artificial neuron. They declared that:\n", + "In 1943 Warren McCulloch, a neurophysiologist, and Walter Pitts, a logician, teamed up to develop a mathematical model of an artificial neuron. In their [paper](https://link.springer.com/article/10.1007/BF02478259) \"A Logical Calculus of the Ideas Immanent in Nervous Activity\" they declared that:\n", "\n", - "> : _Because of the “all-or-none” character of nervous activity, neural events and the relations among them can be treated by means of propositional logic. It is found that the behavior of every net can be described in these terms_. (Pitts and McCulloch; A Logical Calculus of the Ideas Immanent in Nervous Activity)" + "> : Because of the “all-or-none” character of nervous activity, neural events and the relations among them can be treated by means of propositional logic. It is found that the behavior of every net can be described in these terms." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "They realised that a simplified model of a real neuron could be represented using simple addition and thresholding as shown in <>. Pitts was self-taught, and, by age 12, had received an offer to study at Cambridge with the great Bertrand Russell. He did not take up this invitation, and indeed throughout his life did not accept any offers of advanced degrees or positions of authority. Most of his famous work was done whilst he was homeless. Despite his lack of an officially recognized position and increasing social isolation, his work with McCulloch was influential, and was taken up by a psychologist named Frank Rosenblatt." 
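The "simple addition and thresholding" neuron model described above can be sketched in a few lines of Python. This is an illustrative sketch added for clarity, not part of the patch itself; the function name and the example weights and threshold are our own:

```python
# Sketch of a McCulloch-Pitts-style neuron: weighted inputs are summed,
# and the unit "fires" (outputs 1) only if the sum reaches a threshold.
def mcculloch_pitts_neuron(inputs, weights, threshold):
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# With unit weights and a threshold of 2, the unit computes logical AND:
# it fires only when both binary inputs are active.
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, "->", mcculloch_pitts_neuron(x, (1, 1), 2))
```

Lowering the threshold to 1 turns the same unit into logical OR; no single unit of this kind can be configured to compute XOR, which is the limitation Minsky and Papert later highlighted.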
+ "McCulloch and Pitts realized that a simplified model of a real neuron could be represented using simple addition and thresholding, as shown in <>. Pitts was self-taught, and by age 12, had received an offer to study at Cambridge University with the great Bertrand Russell. He did not take up this invitation, and indeed throughout his life did not accept any offers of advanced degrees or positions of authority. Most of his famous work was done while he was homeless. Despite his lack of an officially recognized position and increasing social isolation, his work with McCulloch was influential, and was taken up by a psychologist named Frank Rosenblatt." ] }, { @@ -121,23 +121,23 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Rosenblatt further developed the artificial neuron to give it the ability to learn. Even more importantly, he worked on building the first device that actually used these principles, The Mark I Perceptron. Rosenblatt wrote about this work: "We are about to witness the birth of such a machine – a machine capable of perceiving, recognizing and identifying its surroundings without any human training or control". The perceptron was built, and was able to successfully recognize simple shapes.\n", + "Rosenblatt further developed the artificial neuron to give it the ability to learn. Even more importantly, he worked on building the first device that actually used these principles, the Mark I Perceptron. In "The Design of an Intelligent Automaton" Rosenblatt wrote about this work: "We are now about to witness the birth of such a machine--a machine capable of perceiving, recognizing and identifying its surroundings without any human training or control." The perceptron was built, and was able to successfully recognize simple shapes.\n", "\n", - "An MIT professor named Marvin Minsky (who was a grade behind Rosenblatt at the same high school!) along with Seymour Papert wrote a book, called "Perceptrons", about Rosenblatt's invention. 
They showed that a single layer of these devices was unable to learn some simple, critical mathematical functions (such as XOR). In the same book, they also showed that using multiple layers of the devices would allow these limitations to be addressed. Unfortunately, only the first of these insights was widely recognized. As a result, the global academic community nearly entirely gave up on neural networks for the next two decades." + "An MIT professor named Marvin Minsky (who was a grade behind Rosenblatt at the same high school!), along with Seymour Papert, wrote a book called _Perceptrons_ (MIT Press), about Rosenblatt's invention. They showed that a single layer of these devices was unable to learn some simple but critical mathematical functions (such as XOR). In the same book, they also showed that using multiple layers of the devices would allow these limitations to be addressed. Unfortunately, only the first of these insights was widely recognized. As a result, the global academic community nearly entirely gave up on neural networks for the next two decades." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Perhaps the most pivotal work in neural networks in the last 50 years is the multi-volume *Parallel Distributed Processing* (PDP), released in 1986 by MIT Press. Chapter 1 lays out a similar hope to that shown by Rosenblatt:\n", + "Perhaps the most pivotal work in neural networks in the last 50 years was the multi-volume *Parallel Distributed Processing* (PDP) by David Rumelhart, James McClellan, and the PDP Research Group, released in 1986 by MIT Press. Chapter 1 lays out a similar hope to that shown by Rosenblatt:\n", "\n", - "> : _…people are smarter than today's computers because the brain employs a basic computational architecture that is more suited to deal with a central aspect of the natural information processing tasks that people are so good at. 
…we will introduce a computational framework for modeling cognitive processes that seems… closer than other frameworks to the style of computation as it might be done by the brain._ (PDP, chapter 1)\n", + "> : People are smarter than today's computers because the brain employs a basic computational architecture that is more suited to deal with a central aspect of the natural information processing tasks that people are so good at. ...We will introduce a computational framework for modeling cognitive processes that seems… closer than other frameworks to the style of computation as it might be done by the brain.\n", "\n", - "The premise that PDP is using here is that traditional computer programs work very differently to brains, and that might be why computer programs had benn (at that point) so bad at doing things that brains find easy (such as recognizing objects in pictures). The authors claim that the PDP approach is \"closer \n", + "The premise that PDP is using here is that traditional computer programs work very differently to brains, and that might be why computer programs had been (at that point) so bad at doing things that brains find easy (such as recognizing objects in pictures). The authors claimed that the PDP approach was \"closer \n", "than other frameworks\" to how the brain works, and therefore it might be better able to handle these kinds of tasks.\n", "\n", - "In fact, the approach laid out in PDP is very similar to the approach used in today's neural networks. The book defined \"Parallel Distributed Processing\" as requiring:\n", + "In fact, the approach laid out in PDP is very similar to the approach used in today's neural networks. The book defined parallel distributed processing as requiring:\n", "\n", "1. A set of *processing units*\n", "1. 
A *state of activation*\n", @@ -150,18 +150,18 @@ "\n", "We will see in this book that modern neural networks handle each of these requirements.\n", "\n", - "In the 1980's most models were built with a second layer of neurons, thus avoiding the problem that had been identified by Minsky (this was their \"pattern of connectivity among units\", to use the framework above). And indeed, neural networks were widely used during the 80s and 90s for real, practical projects. However, again a misunderstanding of the theoretical issues held back the field. In theory, adding just one extra layer of neurons was enough to allow any mathematical function to be approximated with these neural networks, but in practice such networks were often too big and too slow to be useful.\n", + "In the 1980's most models were built with a second layer of neurons, thus avoiding the problem that had been identified by Minsky and Papert (this was their \"pattern of connectivity among units,\" to use the framework above). And indeed, neural networks were widely used during the '80s and '90s for real, practical projects. However, again a misunderstanding of the theoretical issues held back the field. In theory, adding just one extra layer of neurons was enough to allow any mathematical function to be approximated with these neural networks, but in practice such networks were often too big and too slow to be useful.\n", "\n", - "Although researchers showed 30 years ago that to get practical good performance you need to use even more layers of neurons, it is only in the last decade that this principle has been more widely appreciatedand applied. Neural networks are now finally living up to their potential, thanks to the use of more layers, coupled with the capacity to do so due to improvements in computer hardware, increases in data availability, and algorithmic tweaks that allow neural networks to be trained faster and more easily. 
We now have what Rosenblatt had promised: \"a machine capable of perceiving, recognizing and identifying its surroundings without any human training or control\".\n", + "Although researchers showed 30 years ago that to get practical good performance you need to use even more layers of neurons, it is only in the last decade that this principle has been more widely appreciated and applied. Neural networks are now finally living up to their potential, thanks to the use of more layers, coupled with the capacity to do so due to improvements in computer hardware, increases in data availability, and algorithmic tweaks that allow neural networks to be trained faster and more easily. We now have what Rosenblatt promised: \"a machine capable of perceiving, recognizing, and identifying its surroundings without any human training or control.\"\n", "\n", - "This is what you will learn how to build in this book. Since we are going to be spending a lot of time together, let's get to know each other a bit… " + "This is what you will learn how to build in this book. But first, since we are going to be spending a lot of time together, let's get to know each other a bit… " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Who we are" + "## Who We Are" ] }, { @@ -170,7 +170,7 @@ "source": [ "We are Sylvain and Jeremy, your guides on this journey. We hope that you will find us well suited for this position.\n", "\n", - "Jeremy has been using and teaching machine learning for around 30 years. He started using neural networks 25 years ago. During this time, he has led many companies and projects which have machine learning at their core, including founding the first company to focus on deep learning and medicine, Enlitic, and taking on the role of President and Chief Scientist of the world's largest machine learning community, Kaggle. 
He is the co-founder, along with Dr Rachel Thomas, of fast.ai, the organisation that built the course this book is based on.\n", + "Jeremy has been using and teaching machine learning for around 30 years. He started using neural networks 25 years ago. During this time, he has led many companies and projects that have machine learning at their core, including founding the first company to focus on deep learning and medicine, Enlitic, and taking on the role of President and Chief Scientist of the world's largest machine learning community, Kaggle. He is the co-founder, along with Dr. Rachel Thomas, of fast.ai, the organization that built the course this book is based on.\n", "\n", "From time to time you will hear directly from us, in sidebars like this one from Jeremy:" ] @@ -179,32 +179,32 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "> J: Hi everybody, I'm Jeremy! You might be interested to know that I do not have any formal technical education. I completed a Bachelor of Arts, with a major in philosophy, and didn't do very well in my university grades. I was much more interested in doing real projects, rather than theoretical studies, so I worked full-time at a management consulting firm called McKinsey and Company throughout my degree. If you're somebody who would rather get their hands dirty building stuff than spend years learning abstract concepts, then you will understand where I am coming from! Look out for sidebars from me to find information most suited to people with a less mathematical or formal technical background—that is, people like me…" + "> J: Hi everybody, I'm Jeremy! You might be interested to know that I do not have any formal technical education. I completed a BA, with a major in philosophy, and didn't have great grades. I was much more interested in doing real projects, rather than theoretical studies, so I worked full time at a management consulting firm called McKinsey and Company throughout my university years. 
If you're somebody who would rather get their hands dirty building stuff than spend years learning abstract concepts, then you will understand where I am coming from! Look out for sidebars from me to find information most suited to people with a less mathematical or formal technical background—that is, people like me…" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Sylvain, on the other hand, knows a lot about formal technical education. In fact, he has written 10 maths textbooks, covering the entire advanced French maths curriculum!" + "Sylvain, on the other hand, knows a lot about formal technical education. In fact, he has written 10 math textbooks, covering the entire advanced French math curriculum!" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "> S: Unlike Jeremy, I have not spent many years coding and applying machine learning algorithms. Rather, I recently came to the machine learning world, by watching Jeremy's fast.ai course videos. So, if you are somebody who has not opened a terminal and written commands at the command line, then you will understand where I am coming from! Look out for sidebars from me to find information most suited to people with a more mathematical or formal technical background, but less real-world coding—that is, people like me…" + "> S: Unlike Jeremy, I have not spent many years coding and applying machine learning algorithms. Rather, I recently came to the machine learning world, by watching Jeremy's fast.ai course videos. So, if you are somebody who has not opened a terminal and written commands at the command line, then you will understand where I am coming from! 
Look out for sidebars from me to find information most suited to people with a more mathematical or formal technical background, but less real-world coding experience—that is, people like me…" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "The fast.ai course has been studied by hundreds of thousands of students, from all walks of life, from all parts of the world. Sylvain stood out as the most impressive student of the course that Jeremy had ever seen, which led to him joining fast.ai, and then becoming the co-author, along with Jeremy, of the fastai software library.\n", + "The fast.ai course has been studied by hundreds of thousands of students, from all walks of life, from all parts of the world. Sylvain stood out as the most impressive student of the course that Jeremy had ever seen, which led to him joining fast.ai, and then becoming the coauthor, along with Jeremy, of the fastai software library.\n", "\n", - "All this means that you have the best of both worlds: the people who know more about the software than anybody, because they wrote it, an expert on maths, and an expert on coding and machine learning, but also people who understand what it feels like to be a relative outsider in maths, and a relative outsider in coding and machine learning.\n", + "All this means that between us you have the best of both worlds: the people who know more about the software than anybody else, because they wrote it; an expert on math, and an expert on coding and machine learning; and also people who understand both what it feels like to be a relative outsider in math, and a relative outsider in coding and machine learning.\n", "\n", - "Anybody who has watched sports knows that if you have a two-person commentary team then you also need a third person to do \"special comments\". Our special commentator is Alexis Gallagher. 
Alexis has a very diverse background: he has been a researcher in mathematical biology, a screenplay writer, an improv performer, a McKinsey consultant (like Jeremy!), a Swift coder, and a CTO." + "Anybody who has watched sports knows that if you have a two-person commentary team then you also need a third person to do \"special comments.\" Our special commentator is Alexis Gallagher. Alexis has a very diverse background: he has been a researcher in mathematical biology, a screenplay writer, an improv performer, a McKinsey consultant (like Jeremy!), a Swift coder, and a CTO." ] }, { @@ -218,70 +218,70 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## How to learn deep learning" + "## How to Learn Deep Learning" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Harvard professor David Perkins, who wrote Making Learning Whole, has much to say about teaching. The basic idea is to teach the *whole game*. That means that if you're teaching baseball, you first take people to a baseball game or get them to play it. You don't teach them how to line thread into a ball, the physics of a parabola, or the coefficient of friction of a ball on a bat.\n", + "Harvard professor David Perkins, who wrote _Making Learning Whole_ (Jossey-Bass), has much to say about teaching. The basic idea is to teach the *whole game*. That means that if you're teaching baseball, you first take people to a baseball game or get them to play it. You don't teach them how to line thread into a ball, the physics of a parabola, or the coefficient of friction of a ball on a bat.\n", "\n", - "Paul Lockhart, a Columbia math PhD, former Brown professor, and K-12 math teacher, imagines in the influential essay A Mathematician's Lament a nightmare world where music and art are taught the way math is taught. 
Children would not be allowed to listen to or play music until they have spent over a decade mastering music notation and theory, spending classes transposing sheet music into a different key. In art class, students study colours and applicators, but aren't allowed to actually paint until college. Sound absurd? This is how math is taught–we require students to spend years doing rote memorization, and learning dry, disconnected *fundamentals* that we claim will pay off later, long after most of them quit the subject.\n", + "Paul Lockhart, a Columbia math PhD, former Brown professor, and K-12 math teacher, imagines in the influential [essay](https://www.maa.org/external_archive/devlin/LockhartsLament.pdf) "A Mathematician's Lament" a nightmare world where music and art are taught the way math is taught. Children are not allowed to listen to or play music until they have spent over a decade mastering music notation and theory, spending classes transposing sheet music into a different key. In art class, students study colors and applicators, but aren't allowed to actually paint until college. Sound absurd? This is how math is taught--we require students to spend years doing rote memorization and learning dry, disconnected *fundamentals* that we claim will pay off later, long after most of them quit the subject.\n", "\n", - "Unfortunately, this is where many teaching resources on deep learning begin–asking learners to follow along with the definition of the Hessian and theorems for the Taylor approximation of your loss functions, without ever giving examples of actual working code. We're not knocking calculus. 
We love calculus and have even taught it at the college level, but we don't think it's the best place to start when learning deep learning!\n", + "Unfortunately, this is where many teaching resources on deep learning begin--asking learners to follow along with the definition of the Hessian and theorems for the Taylor approximation of your loss functions, without ever giving examples of actual working code. We're not knocking calculus. We love calculus, and Sylvain has even taught it at the college level, but we don't think it's the best place to start when learning deep learning!\n", "\n", - "In deep learning, it really helps if you have the motivation to fix your model to get it to do better. That's when you start learning the relevant theory. But you need to have the model in the first place. We teach almost everything through real examples. As we build out those examples, we go deeper and deeper, and we'll show you how to make your projects better and better. This means that you'll be gradually learning all the theoretical foundations you need, in context, in a way that you'll see why it matters and how it works.\n", + "In deep learning, it really helps if you have the motivation to fix your model to get it to do better. That's when you start learning the relevant theory. But you need to have the model in the first place. We teach almost everything through real examples. As we build out those examples, we go deeper and deeper, and we'll show you how to make your projects better and better. This means that you'll be gradually learning all the theoretical foundations you need, in context, in such a way that you'll see why it matters and how it works.\n", "\n", "So, here's our commitment to you. Throughout this book, we will follow these principles:\n", "\n", - "- Teaching the *whole game* – starting off by showing how to use a complete, working, very usable, state of the art deep learning network to solve real world problems, by using simple, expressive tools. 
And then gradually digging deeper and deeper into understanding how those tools are made, and how the tools that make those tools are made, and so on…\n", - "- Always teaching through examples: ensuring that there is a context and a purpose that you can understand intuitively, rather than starting with algebraic symbol manipulation ;\n", - "- Simplifying as much as possible: we've spent years building tools and teaching methods that make previously complex topics very simple ;\n", - "- Removing barriers: deep learning has, until now, been a very exclusive game. We're breaking it open, and ensuring that everyone can play." + "- Teaching the *whole game*. We'll start by showing how to use a complete, working, very usable, state-of-the-art deep learning network to solve real-world problems, using simple, expressive tools. And then we'll gradually dig deeper and deeper into understanding how those tools are made, and how the tools that make those tools are made, and so on…\n", + "- Always teaching through examples. We'll ensure that there is a context and a purpose that you can understand intuitively, rather than starting with algebraic symbol manipulation.\n", + "- Simplifying as much as possible. We've spent years building tools and teaching methods that make previously complex topics very simple.\n", + "- Removing barriers. Deep learning has, until now, been a very exclusive game. We're breaking it open, and ensuring that everyone can play." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "The hardest part of deep learning is artisanal: how do you know if you've got enough data; whether it is in the right format; if your model is training properly; and if it's not, what should you do about it? That is why we believe in learning by doing. As with basic data science skills, with deep learning you only get better through practical experience. Trying to spend too much time on the theory can be counterproductive. 
The key is to just code and try to solve problems: the theory can come later, when you have context and motivation.\n", + "The hardest part of deep learning is artisanal: how do you know if you've got enough data, whether it is in the right format, if your model is training properly, and, if it's not, what you should do about it? That is why we believe in learning by doing. As with basic data science skills, with deep learning you only get better through practical experience. Trying to spend too much time on the theory can be counterproductive. The key is to just code and try to solve problems: the theory can come later, when you have context and motivation.\n", "\n", - "There will be times when the journey will feel hard. Times where you feel stuck. Don't give up! Rewind through the book to find the last bit where you definitely weren't stuck, and then read slowly through from there to find the first thing that isn't clear. Then try some code experiments yourself, and Google around for more tutorials on whatever the issue you're stuck with is--often you'll find some different angle on the material which might help it to click. Also, it's expected and normal to not understand everything (especially the code) on first reading. Trying to understand the material serially before proceeding can sometimes be hard. Sometimes things click into place after you get more context from parts down the road, from having a bigger picture. So if you do get stuck on a section, try moving on anyway and make a note to come back to it later.\n", + "There will be times when the journey will feel hard. Times where you feel stuck. Don't give up! Rewind through the book to find the last bit where you definitely weren't stuck, and then read slowly through from there to find the first thing that isn't clear. 
Then try some code experiments yourself, and Google around for more tutorials on whatever the issue you're stuck with is--often you'll find some different angle on the material might help it to click. Also, it's expected and normal to not understand everything (especially the code) on first reading. Trying to understand the material serially before proceeding can sometimes be hard. Sometimes things click into place after you get more context from parts down the road, from having a bigger picture. So if you do get stuck on a section, try moving on anyway and make a note to come back to it later.\n", "\n", - "Remember, you don't need any particular academic background to succeed at deep learning. Many important breakthroughs are made in research and industry by folks without a PhD, such as the paper [Unsupervised Representation Learning with Deep Convolutional Generative Adversarial Networks](https://arxiv.org/abs/1511.06434), one of the most influential papers of the last decade, with over 5000 citations, which was written by Alec Radford when he was an under-graduate. Even at Tesla, where they're trying to solve the extremely tough challenge of making a self-driving car, CEO [Elon Musk says](https://twitter.com/elonmusk/status/1224089444963311616):\n", + "Remember, you don't need any particular academic background to succeed at deep learning. Many important breakthroughs are made in research and industry by folks without a PhD, such as [\"Unsupervised Representation Learning with Deep Convolutional Generative Adversarial Networks\"](https://arxiv.org/abs/1511.06434)--one of the most influential papers of the last decade--with over 5,000 citations, which was written by Alec Radford when he was an undergraduate. Even at Tesla, where they're trying to solve the extremely tough challenge of making a self-driving car, CEO [Elon Musk says](https://twitter.com/elonmusk/status/1224089444963311616):\n", "\n", - "> : \"A PhD is definitely not required. 
All that matters is a deep understanding of AI & ability to implement NNs in a way that is actually useful (latter point is what’s truly hard). Don’t care if you even graduated high school.\"" + "> : A PhD is definitely not required. All that matters is a deep understanding of AI & ability to implement NNs in a way that is actually useful (latter point is what’s truly hard). Don’t care if you even graduated high school." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "What you will need to succeed however is to apply what you learn in this book to a personal project and always persevere." + "What you will need to do to succeed however is to apply what you learn in this book to a personal project, and always persevere." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Your projects and your mindset" + "### Your Projects and Your Mindset" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Whether you're excited to identify if plants are diseased from pictures of their leaves, auto-generate knitting patterns, diagnose TB from x-rays, or determine when a raccoon is using your cat door, we will get you using deep learning on your own problems (via pre-trained models from others) as quickly as possible, and then will progressively drill into more details. You'll learn how to use deep learning to solve your own problems at state-of-the-art accuracy within the first 30 minutes of the next chapter! (And feel free to skip straight there now if you're dying to get coding right away.) 
There is a pernicious myth out there that you need to have computing resources and datasets the size of those at Google to be able to do deep learning, and it's not true.\n",
+    "Whether you're excited to identify if plants are diseased from pictures of their leaves, auto-generate knitting patterns, diagnose TB from X-rays, or determine when a raccoon is using your cat door, we will get you using deep learning on your own problems (via pre-trained models from others) as quickly as possible, and then will progressively drill into more details. You'll learn how to use deep learning to solve your own problems at state-of-the-art accuracy within the first 30 minutes of the next chapter! (And feel free to skip straight there now if you're dying to get coding right away.) There is a pernicious myth out there that you need to have computing resources and datasets the size of those at Google to be able to do deep learning, but it's not true.\n",
    "\n",
-    "So, what sort of tasks make for good test cases? You could train your model to distinguish between Picasso and Monet paintings or to pick out pictures of your daughter instead of pictures of your son. It helps to focus on your hobbies and passions–setting yourself four of five little projects rather than striving to solve a big, grand problem tends to work better when you're getting started. Since it is easy to get stuck, trying to be too ambitious too early can often backfire. Then, once you've got the basics mastered, aim to complete something you're really proud of!"
+    "So, what sorts of tasks make for good test cases? You could train your model to distinguish between Picasso and Monet paintings or to pick out pictures of your daughter instead of pictures of your son. It helps to focus on your hobbies and passions--setting yourself four or five little projects rather than striving to solve a big, grand problem tends to work better when you're getting started. 
Since it is easy to get stuck, trying to be too ambitious too early can often backfire. Then, once you've got the basics mastered, aim to complete something you're really proud of!" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "> J: Deep learning can be set to work on almost any problem. For instance, my first startup was a company called FastMail, which provided enhanced email services when it launched in 1999 (and still does to this day). In 2002 I set it up to use a primitive form of deep learning – single-layer neural networks – to help categorise emails and stop customers from receiving spam." + "> J: Deep learning can be set to work on almost any problem. For instance, my first startup was a company called FastMail, which provided enhanced email services when it launched in 1999 (and still does to this day). In 2002 I set it up to use a primitive form of deep learning, single-layer neural networks, to help categorize emails and stop customers from receiving spam." ] }, { @@ -302,14 +302,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## The software: PyTorch, fastai, and Jupyter" + "## The Software: PyTorch, fastai, and Jupyter" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "(and why it doesn't matter)" + "(And Why It Doesn't Matter)" ] }, { @@ -318,25 +318,25 @@ "source": [ "We've completed hundreds of machine learning projects using dozens of different packages, and many different programming languages. At fast.ai, we have written courses using most of the main deep learning and machine learning packages used today. After PyTorch came out in 2017 we spent over a thousand hours testing it before deciding that we would use it for future courses, software development, and research. Since that time PyTorch has become the world's fastest-growing deep learning library and is already used for most research papers at top conferences. 
This is generally a leading indicator of usage in industry, because these are the papers that end up getting used in products and services commercially. We have found that PyTorch is the most flexible and expressive library for deep learning. It does not trade off speed for simplicity, but provides both.\n", "\n", - "PyTorch works best as a low-level foundation library, providing the basic operations for higher level functionality. The fastai library is the most popular library for adding this higher-level functionality on top of PyTorch. It's also particularly well suited for the purposes of this book, because it is unique in providing a deeply layered software architecture (there's even a [peer-reviewed academic paper](https://arxiv.org/abs/2002.04688) about this layered API). In this book, as we go deeper and deeper into the foundations of deep learning, we will also go deeper and deeper into the layers of fastai. This book covers version 2 of the fastai library, which is a from-scratch rewrite providing many unique features." + "PyTorch works best as a low-level foundation library, providing the basic operations for higher-level functionality. The fastai library is the most popular library for adding this higher-level functionality on top of PyTorch. It's also particularly well suited to the purposes of this book, because it is unique in providing a deeply layered software architecture (there's even a [peer-reviewed academic paper](https://arxiv.org/abs/2002.04688) about this layered API). In this book, as we go deeper and deeper into the foundations of deep learning, we will also go deeper and deeper into the layers of fastai. This book covers version 2 of the fastai library, which is a from-scratch rewrite providing many unique features." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "However, it doesn't really matter what software you learn, because it takes only a few days to learn to switch from one library to another. 
What really matters is learning the deep learning foundations and techniques properly. Our focus will be on using code which as clearly as possible expresses the concepts that you need to learn. Where we are teaching high-level concepts, we will use high level fastai code. Where we are teaching low-level concepts, we will use low-level PyTorch, or even pure Python code.\n",
+    "However, it doesn't really matter what software you learn, because it takes only a few days to learn to switch from one library to another. What really matters is learning the deep learning foundations and techniques properly. Our focus will be on using code that as clearly as possible expresses the concepts that you need to learn. Where we are teaching high-level concepts, we will use high-level fastai code. Where we are teaching low-level concepts, we will use low-level PyTorch, or even pure Python code.\n",
    "\n",
    "If it feels like new deep learning libraries are appearing at a rapid pace nowadays, then you need to be prepared for a much faster rate of change in the coming months and years. As more people enter the field, they will bring more skills and ideas, and try more things. You should assume that whatever specific libraries and software you learn today will be obsolete in a year or two. Just think about the number of changes of libraries and technology stacks that occur all the time in the world of web programming — and yet this is a much more mature and slow-growing area than deep learning. We strongly believe that the focus in learning needs to be on understanding the underlying techniques and how to apply them in practice, and how to quickly build expertise in new tools and techniques as they are released."
+    "If it feels like new deep learning libraries are appearing at a rapid pace nowadays, then you need to be prepared for a much faster rate of change in the coming months and years. As more people enter the field, they will bring more skills and ideas, and try more things. 
You should assume that whatever specific libraries and software you learn today will be obsolete in a year or two. Just think about the number of changes in libraries and technology stacks that occur all the time in the world of web programming—a much more mature and slow-growing area than deep learning. We strongly believe that the focus in learning needs to be on understanding the underlying techniques and how to apply them in practice, and how to quickly build expertise in new tools and techniques as they are released." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "By the end of the book, you'll understand nearly all the code that's inside fastai (and much of PyTorch too), because each chapter we'll be digging a level deeper to understand exactly what's going on as we build and train our models. This means that you'll have learnt the most important best practices used in modern deep learning—not just how to use them, but how they really work and are implemented. If you want to use those approaches in another framework, you'll have the knowledge you need to develop it if needed.\n", + "By the end of the book, you'll understand nearly all the code that's inside fastai (and much of PyTorch too), because in each chapter we'll be digging a level deeper to show you exactly what's going on as we build and train our models. This means that you'll have learned the most important best practices used in modern deep learning—not just how to use them, but how they really work and are implemented. If you want to use those approaches in another framework, you'll have the knowledge you need to do so if needed.\n", "\n", - "Since the most important thing for learning deep learning is writing code and experimenting, it's important that you have a great platform for experimenting with code. The most popular programming experimentation platform is called Jupyter. This is what we will be using throughout this book. 
We will show you how you can use Jupyter to train and experiment with models and introspect every stage of the data pre-processing and model development pipeline. Jupyter is the most popular tool for doing data science in Python, for good reason. It is powerful, flexible, and easy to use. We think you will love it!" + "Since the most important thing for learning deep learning is writing code and experimenting, it's important that you have a great platform for experimenting with code. The most popular programming experimentation platform is called Jupyter. This is what we will be using throughout this book. We will show you how you can use Jupyter to train and experiment with models and introspect every stage of the data pre-processing and model development pipeline. [Jupyter Notebook](https://jupyter.org/) is the most popular tool for doing data science in Python, for good reason. It is powerful, flexible, and easy to use. We think you will love it!" ] }, { @@ -350,28 +350,28 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Your first model" + "## Your First Model" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "As we said before, we will teach how to do things before we explain why they work. Following this top-down approach, we will begin by actually training an image classifier to recognize dogs and cats with almost 100% accuracy. To train this model and run our experiments, you will need some initial setup. Don't worry, it's not as hard as it looks." + "As we said before, we will teach you how to do things before we explain why they work. Following this top-down approach, we will begin by actually training an image classifier to recognize dogs and cats with almost 100% accuracy. To train this model and run our experiments, you will need to do some initial setup. Don't worry, it's not as hard as it looks." 
] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "> s: Do not skip the setup part even if it looks intimidating at first, especially if you have little or no experience using things like a terminal or the command line. Most of that is actually not necessary and you will find that the easiest servers can be setup with just your usual web browser. It is crucial that you run your own experiments in parallel with this book in order to learn." + "> s: Do not skip the setup part even if it looks intimidating at first, especially if you have little or no experience using things like a terminal or the command line. Most of that is actually not necessary and you will find that the easiest servers can be set up with just your usual web browser. It is crucial that you run your own experiments in parallel with this book in order to learn." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Getting a GPU deep learning server" + "### Getting a GPU Deep Learning Server" ] }, { @@ -385,25 +385,25 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "> jargon: (Graphic Processing Unit) GPU: Also known as a *graphics card*. A special kind of processor in your computer than can handle thousands of single tasks at the same time, especially designed for displaying 3D environments on a computer for playing games. These same basic tasks are very similar to what neural networks do, such that GPUs can run neural networks hundreds of times faster than regular CPUs. All modern computers contain a GPU, but few contain the right kind of GPU necessary for deep learning." + "> jargon: Graphics Processing Unit (GPU): Also known as a _graphics card_. A special kind of processor in your computer that can handle thousands of single tasks at the same time, especially designed for displaying 3D environments on a computer for playing games. 
These same basic tasks are very similar to what neural networks do, such that GPUs can run neural networks hundreds of times faster than regular CPUs. All modern computers contain a GPU, but few contain the right kind of GPU necessary for deep learning." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "The best choice for GPU servers for use with this book changes over time, as companies come and go, and prices change. We keep a list of our recommended options on the [book website](https://book.fast.ai/). So, go there now, and follow the instructions to get connected to a GPU deep learning server. Don't worry, it only takes about two minutes to get set up on most platforms, and many don't even require any payment, or even a credit card to get started.\n", + "The best choice of GPU servers to use with this book will change over time, as companies come and go and prices change. We maintain a list of our recommended options on the [book's website](https://book.fast.ai/), so go there now and follow the instructions to get connected to a GPU deep learning server. Don't worry, it only takes about two minutes to get set up on most platforms, and many don't even require any payment, or even a credit card, to get started.\n", "\n", - "> A: My two cents: heed this advice! If you like computers you will be tempted to setup your own box. Beware! It is feasible but surprisingly involved and distracting. There is a good reason this book is not titled, _Everything you ever wanted to know about Ubuntu system administration, NVIDIA driver installation, apt-get, conda, pip, and Jupyter notebook configuration_. That would be a book of its own. Having designed and deployed our production machine learning infrastructure at work, I can testify it has its satisfactions but it is as unrelated to modelling as maintaining an airplane is to flying one.\n", + "> A: My two cents: heed this advice! If you like computers you will be tempted to set up your own box. Beware! 
It is feasible but surprisingly involved and distracting. There is a good reason this book is not titled, _Everything You Ever Wanted to Know About Ubuntu System Administration, NVIDIA Driver Installation, apt-get, conda, pip, and Jupyter Notebook Configuration_. That would be a book of its own. Having designed and deployed our production machine learning infrastructure at work, I can testify it has its satisfactions, but it is as unrelated to modeling as maintaining an airplane is to flying one.\n", "\n", - "Each option shown on the book website includes a tutorial; after completing the tutorial, you will end up with a screen looking like <>." + "Each option shown on the website includes a tutorial; after completing the tutorial, you will end up with a screen looking like <>." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "\"Initial" + "\"Initial" ] }, { @@ -417,35 +417,35 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "> jargon: Jupyter Notebook: A piece of software that allows you to include formatted text, code, images, videos, and much more, all within a single interactive document. Jupyter received the highest honor for software, the ACM Software System Award, thanks to its wide use and enormous impact in many academic fields, and in industry. Jupyter Notebook is the most widely used software by data scientists for developing and interacting with deep learning models." + "> jargon: Jupyter Notebook: A piece of software that allows you to include formatted text, code, images, videos, and much more, all within a single interactive document. Jupyter received the highest honor for software, the ACM Software System Award, thanks to its wide use and enormous impact in many academic fields and in industry. Jupyter Notebook is the software most widely used by data scientists for developing and interacting with deep learning models." 
] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Running your first notebook" + "### Running Your First Notebook" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "The notebooks are labelled by chapter, and then by notebook number, so that they are in the same order as they are presented in this book. So, the very first notebook you will see listed, is the notebook that we need to use now. You will be using this notebook to train a model that can recognize dog and cat photos. To do this, we'll be downloading a _dataset_ of dog and cat photos, and using that to _train a model_. A _dataset_ simply refers to a bunch of data—it could be images, emails, financial indicators, sounds, or anything else. There are many datasets made freely available that are suitable for training models. Many of these datasets are created by academics to help advance research, many are made available for competitions (there are competitions where data scientists can compete to see who has the most accurate model!), and some are by-products of other processes (such as financial filings)." + "The notebooks are labeled by chapter and then by notebook number, so that they are in the same order as they are presented in this book. So, the very first notebook you will see listed is the notebook that you need to use now. You will be using this notebook to train a model that can recognize dog and cat photos. To do this, you'll be downloading a _dataset_ of dog and cat photos, and using that to _train a model_. A dataset is simply a bunch of data—it could be images, emails, financial indicators, sounds, or anything else. There are many datasets made freely available that are suitable for training models. 
Many of these datasets are created by academics to help advance research, many are made available for competitions (there are competitions where data scientists can compete to see who has the most accurate model!), and some are by-products of other processes (such as financial filings)." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "> note: There are two folders containing different versions of the notebooks. The **full** folder contains the exact notebooks used to create the book you're reading now, with all the prose and outputs. The **stripped** version has the same headings and code cells, but all outputs and prose have been removed. After reading a section of the book, we recommend working through the stripped notebooks, with the book closed, and see if you can figure out what each cell will show before you execute it. And try to recall what the code is demonstrating." + "> note: Full and Stripped Notebooks: There are two folders containing different versions of the notebooks. The _full_ folder contains the exact notebooks used to create the book you're reading now, with all the prose and outputs. The _stripped_ version has the same headings and code cells, but all outputs and prose have been removed. After reading a section of the book, we recommend working through the stripped notebooks, with the book closed, and seeing if you can figure out what each cell will show before you execute it. Also try to recall what the code is demonstrating." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "To open a notebook, just click on it. The notebook will open, and it will look something like <> (note that there may be slight differences in details across different platforms; you can ignore those differences):" + "To open a notebook, just click on it. The notebook will open, and it will look something like <> (note that there may be slight differences in details across different platforms; you can ignore those differences)." 
] }, { @@ -461,27 +461,27 @@ "source": [ "A notebook consists of _cells_. There are two main types of cell:\n", "\n", - "- Cells containing formatted text, images, and so forth. These use a format called *markdown*, which we will learn about soon\n", - "- Cells containing code, which can be executed, and outputs will appear immediately underneath (which could be plain text, tables, images, animations, sounds, or even interactive applications)\n", + "- Cells containing formatted text, images, and so forth. These use a format called *markdown*, which you will learn about soon.\n", + "- Cells containing code that can be executed, and outputs will appear immediately underneath (which could be plain text, tables, images, animations, sounds, or even interactive applications).\n", "\n", - "Jupyter notebooks can be in one of two modes, edit mode, or command mode. In edit mode typing the keys on your keyboard types the letters into the cell in the usual way. However, in command mode, you will not see any flashing cursor, and the keys on your keyboard will each have a special function.\n", + "Jupyter notebooks can be in one of two modes: edit mode or command mode. In edit mode typing on your keyboard enters the letters into the cell in the usual way. However, in command mode, you will not see any flashing cursor, and the keys on your keyboard will each have a special function.\n", "\n", - "Let's make sure that you are in command mode before continuing: press \"escape\" now on your keyboard to switch to command mode (if you are already in command mode, then this does nothing, so press it now just in case). To see a complete list of all of the functions available, press \"h\"; press \"escape\" to remove this help screen. 
Notice that in command mode, unlike most programs, commands do not require you to hold down \"control\", \"alt\", or similar — you simply press the required letter key.\n", + "Before continuing, press the Escape key on your keyboard to switch to command mode (if you are already in command mode, this does nothing, so press it now just in case). To see a complete list of all of the functions available, press H; press Escape to remove this help screen. Notice that in command mode, unlike most programs, commands do not require you to hold down Control, Alt, or similar—you simply press the required letter key.\n", "\n", - "You can make a copy of a cell by pressing \"c\" (it needs to be selected first, indicated with an outline around the cell; if it is not already selected, click on it once). Then press \"v\" to paste a copy of it." + "You can make a copy of a cell by pressing C (the cell needs to be selected first, indicated with an outline around it; if it is not already selected, click on it once). Then press V to paste a copy of it." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "When you click on a cell it will be selected. Click on the cell now which begins with the line \"# CLICK ME\". The first character in that line represents a comment in Python, so is ignored when executing the cell. The rest of the cell is, believe it or not, a complete system for creating and training a state-of-the-art model for recognizing cats versus dogs. So, let's train it now! To do so, just press shift-enter on your keyboard, or press the \"play\" button on the toolbar. Then, wait a few minutes while the following things happen:\n", + "Click on the cell that begins with the line \"# CLICK ME\" to select it. The first character in that line indicates that what follows is a comment in Python, so it is ignored when executing the cell. 
The rest of the cell is, believe it or not, a complete system for creating and training a state-of-the-art model for recognizing cats versus dogs. So, let's train it now! To do so, just press Shift-Enter on your keyboard, or press the Play button on the toolbar. Then wait a few minutes while the following things happen:\n",
    "\n",
    "1. A dataset called the [Oxford-IIIT Pet Dataset](http://www.robots.ox.ac.uk/~vgg/data/pets/) that contains 7,349 images of cats and dogs from 37 different breeds will be downloaded from the fast.ai datasets collection to the GPU server you are using, and will then be extracted.\n",
    "2. A *pretrained model* that has already been trained on 1.3 million images, using a competition-winning model, will be downloaded from the internet.\n",
    "3. The pretrained model will be *fine-tuned* using the latest advances in transfer learning, to create a model that is specially customized for recognizing dogs and cats.\n",
    "\n",
    "The first two steps only need to be run once on your GPU server. If you run the cell again, it will use the dataset and model that have already been downloaded, rather than downloading them again. 
Let's take a look at the contents of the cell, and the results (<>):"
   ]
  },
  {
@@ -572,28 +572,28 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "You will probably not see exactly the same results that are in the book. There are a lot of sources of small random variation involved in training models. We generally see an error rate of well less than 0.02 in this example."
+    "You will probably not see exactly the same results that are in the book. There are a lot of sources of small random variation involved in training models. We generally see an error rate of well less than 0.02 in this example, however."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-    "> important: Depending on your network speed, it might take a few minutes to download the pretrained model and dataset. Running `fine_tune` might take a minute or so. Often models in this book take a few minutes to train, as will your own models. So it's a good idea to come up with good techniques to make the most of this time. For instance, keep reading the next section while your model trains, or open up another notebook and use it for some coding experiments."
+    "> important: Training Time: Depending on your network speed, it might take a few minutes to download the pretrained model and dataset. Running `fine_tune` might take a minute or so. Often models in this book take a few minutes to train, as will your own models, so it's a good idea to come up with good techniques to make the most of this time. For instance, keep reading the next section while your model trains, or open up another notebook and use it for some coding experiments."
]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-    "### Sidebar: This book was written in Jupyter Notebooks"
+    "### Sidebar: This Book Was Written in Jupyter Notebooks"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-    "We wrote this book using Jupyter Notebooks, so for nearly every chart, table, and calculation in this book, we'll be showing you all the exact code required to replicate it yourself. That's why very often in this book, you will see some code immediately followed by a table, a picture or just some text. If you go on the [book website](https://book.fast.ai) you will find all the code and you can try running and modifying every example yourself."
+    "We wrote this book using Jupyter notebooks, so for nearly every chart, table, and calculation in this book, we'll be showing you the exact code required to replicate it yourself. That's why very often in this book, you will see some code immediately followed by a table, a picture, or just some text. If you go to the [book's website](https://book.fast.ai) you will find all the code, and you can try running and modifying every example yourself."
   ]
  },
  {
@@ -663,9 +663,9 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "So, how do we know if this model is any good? In the last column of the table you can see the error rate, which is the proportion of images that were incorrectly identified. The error rate serves as our metric -- our measure of model quality, chosen to be intuitive and comprehensible. As you can see, the model is nearly perfect, even though the training time was only a few seconds (not including the one-time downloading of the dataset and the pretrained model). In fact, the accuracy you've achieved already is far better than anybody had ever achieved just 10 years ago!\n",
+    "So, how do we know if this model is any good? In the last column of the table you can see the error rate, which is the proportion of images that were incorrectly identified. 
The error rate serves as our metric--our measure of model quality, chosen to be intuitive and comprehensible. As you can see, the model is nearly perfect, even though the training time was only a few seconds (not including the one-time downloading of the dataset and the pretrained model). In fact, the accuracy you've achieved already is far better than anybody had ever achieved just 10 years ago!\n", "\n", - "Finally, let's check that this model actually works. Go and get a photo of a dog, or a cat; if you don't have one handy, just search Google images and download an image that you find there. Now execute the cell with `uploader` defined. It will output a button you can click, so you can select the image you want to classify." + "Finally, let's check that this model actually works. Go and get a photo of a dog, or a cat; if you don't have one handy, just search Google Images and download an image that you find there. Now execute the cell with `uploader` defined. It will output a button you can click, so you can select the image you want to classify:" ] }, { @@ -705,7 +705,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Now we can pass the uploaded file to the model. The notebook will tell you whether it thinks it is a dog, or a cat, and how confident it is. Make sure that it is a clear photo of a single dog or a cat, and not a line drawing, cartoon, or similar. Hopefully, you'll find that your model did a great job!" + "Now you can pass the uploaded file to the model. Make sure that it is a clear photo of a single dog or a cat, and not a line drawing, cartoon, or similar. The notebook will tell you whether it thinks it is a dog or a cat, and how confident it is. Hopefully, you'll find that your model did a great job:" ] }, { @@ -758,14 +758,14 @@ "source": [ "Congratulations on your first classifier!\n", "\n", - "But what does this mean? But what did we actually do? In order to explain this, let's zoom out again to take in the big picture. 
" + "But what does this mean? What did you actually do? In order to explain this, let's zoom out again to take in the big picture. " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### What is machine learning?" + "### What Is Machine Learning?" ] }, { @@ -776,11 +776,11 @@ "\n", "Another key piece of context is that deep learning is just a modern area in the more general discipline of *machine learning*. To understand the essence of what you did when you trained your own classification model, you don't need to understand deep learning. It is enough to see how your model and your training process are examples of the concepts that apply to machine learning in general.\n", "\n", - "So in this section, we will describe what machine learning is. We will introduce the key concepts, and see how they can be traced back to the original essay that introduced the concept.\n", + "So in this section, we will describe what machine learning is. We will look at the key concepts, and show how they can be traced back to the original essay that introduced them.\n", "\n", - "*Machine learning* is, like regular programming, a way to get computers to complete a specific task. But how would you use regular programming to do what we just did in the last section: recognize dogs vs cats in photos? We would have to write down for the computer the exact steps necessary to complete the task.\n", + "*Machine learning* is, like regular programming, a way to get computers to complete a specific task. But how would we use regular programming to do what we just did in the last section: recognize dogs versus cats in photos? We would have to write down for the computer the exact steps necessary to complete the task.\n", "\n", - "Normally, it's easy enough for us to write down the steps to complete a task when we're writing a program. We just think about the steps we'd take if we had to do the task by hand, and then we translate them into code. 
For instance, we can write a function that sorts a list. In general, we write a function that looks something like <> (where *inputs* might be an unsorted list, and *results* a sorted list)." + "Normally, it's easy enough for us to write down the steps to complete a task when we're writing a program. We just think about the steps we'd take if we had to do the task by hand, and then we translate them into code. For instance, we can write a function that sorts a list. In general, we'd write a function that looks something like <> (where *inputs* might be an unsorted list, and *results* a sorted list)." ] }, { @@ -862,30 +862,30 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "But for recognizing objects in a photo that's a bit tricky; what *are* the steps we take exactly when we recognize an object in a picture? We really don't know, since it all happens in our brain without us being consciously aware of it!\n", + "But for recognizing objects in a photo that's a bit tricky; what *are* the steps we take when we recognize an object in a picture? We really don't know, since it all happens in our brain without us being consciously aware of it!\n", "\n", - "Right back at the dawn of computing, in 1949, an IBM researcher named Arthur Samuel started working on a different way to get computers to complete tasks, which he called *machine learning*. In his classic 1962 essay *Artificial Intelligence: A Frontier of Automation*, he wrote:" + "Right back at the dawn of computing, in 1949, an IBM researcher named Arthur Samuel started working on a different way to get computers to complete tasks, which he called *machine learning*. 
In his classic 1962 essay \"Artificial Intelligence: A Frontier of Automation\", he wrote:" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "> : _Programming a computer for such computations is, at best, a difficult task, not primarily because of any inherent complexity in the computer itself but, rather, because of the need to spell out every minute step of the process in the most exasperating detail. Computers, as any programmer will tell you, are giant morons, not giant brains._" + "> : Programming a computer for such computations is, at best, a difficult task, not primarily because of any inherent complexity in the computer itself but, rather, because of the need to spell out every minute step of the process in the most exasperating detail. Computers, as any programmer will tell you, are giant morons, not giant brains." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "His basic idea was this: instead of telling the computer the exact steps required to solve a problem, instead, show it examples of the problem to solve, and let it figure out how to solve it itself. This turned out to be very effective: by 1961 his checkers playing program had learned so much that it beat the Connecticut state champion! Here's how he described his idea (from the same essay as above):" + "His basic idea was this: instead of telling the computer the exact steps required to solve a problem, show it examples of the problem to solve, and let it figure out how to solve it itself. This turned out to be very effective: by 1961 his checkers-playing program had learned so much that it beat the Connecticut state champion! 
Here's how he described his idea (from the same essay as above):" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "> : _Suppose we arrange for some automatic means of testing the effectiveness of any current weight assignment in terms of actual performance and provide a mechanism for altering the weight assignment so as to maximize the performance. We need not go into the details of such a procedure to see that it could be made entirely automatic and to see that a machine so programmed would \"learn\" from its experience._" + "> : Suppose we arrange for some automatic means of testing the effectiveness of any current weight assignment in terms of actual performance and provide a mechanism for altering the weight assignment so as to maximize the performance. We need not go into the details of such a procedure to see that it could be made entirely automatic and to see that a machine so programmed would \"learn\" from its experience." ] }, { @@ -894,16 +894,16 @@ "source": [ "There are a number of powerful concepts embedded in this short statement: \n", "\n", - "- the idea of a \"weight assignment\" \n", - "- the fact that every weight assignment has some \"actual performance\"\n", - "- the requirement that there is an \"automatic means\" of testing that performance, \n", - "- and last, that there is a \"mechanism\" (i.e., another automatic process) for improving the performance by changing the weight assignments.\n", + "- The idea of a \"weight assignment\" \n", + "- The fact that every weight assignment has some \"actual performance\"\n", + "- The requirement that there be an \"automatic means\" of testing that performance, \n", + "- The need for a \"mechanism\" (i.e., another automatic process) for improving the performance by changing the weight assignments\n", "\n", "Let us take these concepts one by one, in order to understand how they fit together in practice. 
First, we need to understand what Samuel means by a *weight assignment*.\n", "\n", - "Weights are just variables, and a weight assignment is a particular choice of values for those variables. The program's inputs are values that it processes in order to produce its results -- for instance, taking image pixels as inputs, and returning the classification \"dog\" as a result. But the program's weight assignments are other values which define how the program will operate.\n", + "Weights are just variables, and a weight assignment is a particular choice of values for those variables. The program's inputs are values that it processes in order to produce its results--for instance, taking image pixels as inputs, and returning the classification \"dog\" as a result. The program's weight assignments are other values that define how the program will operate.\n", "\n", - "Since they will affect the program they are in a sense another kind of input, so we will update our basic picture of <> and replace it with <> in order to take this into account:" + "Since they will affect the program they are in a sense another kind of input, so we will update our basic picture in <> and replace it with <> in order to take this into account." ] }, { @@ -998,13 +998,13 @@ "source": [ "We've changed the name of our box from *program* to *model*. This is to follow modern terminology and to reflect that the *model* is a special kind of program: it's one that can do *many different things*, depending on the *weights*. It can be implemented in many different ways. For instance, in Samuel's checkers program, different values of the weights would result in different checkers-playing strategies. \n", "\n", - "(By the way, what Samuel called *weights* are most generally refered to as model *parameters* these days, in case you have encountered that term. 
The term *weights* is reserved for a particular type of model parameter.)\n",
+    "(By the way, what Samuel called \"weights\" are most generally referred to as model *parameters* these days, in case you have encountered that term. The term *weights* is reserved for a particular type of model parameter.)\n",
    "\n",
-    "Next, he said we need an *automatic means of testing the effectiveness of any current weight assignment in terms of actual performance*. In the case of his checkers program, the \"actual performance\" of a model would be how well it plays. And you could automatically test the performance of two models by setting them to play against each other, and see which one usually wins.\n",
+    "Next, Samuel said we need an *automatic means of testing the effectiveness of any current weight assignment in terms of actual performance*. In the case of his checkers program, the \"actual performance\" of a model would be how well it plays. And you could automatically test the performance of two models by setting them to play against each other, and seeing which one usually wins.\n",
    "\n",
-    "Finally, he says we need *a mechanism for altering the weight assignment so as to maximize the performance*. For instance, we could look at the difference in weights between the winning model and the losing model, and adjust the weights a little further in the winning *direction*.\n",
+    "Finally, he says we need *a mechanism for altering the weight assignment so as to maximize the performance*. For instance, we could look at the difference in weights between the winning model and the losing model, and adjust the weights a little further in the winning direction.\n",
    "\n",
-    "We can now see why he said that such a procedure *could be made entirely automatic and... a machine so programed would \"learn\" from its experience*. 
Learning would become entirely automatic when the adjustment of the weights was also automatic -- when instead of us improving a model by adjusting its weights manually, we relied on an automated mechanism that produced adjustments based on performance.\n", + "We can now see why he said that such a procedure *could be made entirely automatic and... a machine so programmed would \"learn\" from its experience*. Learning would become entirely automatic when the adjustment of the weights was also automatic--when instead of us improving a model by adjusting its weights manually, we relied on an automated mechanism that produced adjustments based on performance.\n", "\n", "<> shows the full picture of Samuel's idea of training a machine learning model." ] @@ -1123,9 +1123,9 @@ "source": [ "Notice the distinction between the model's *results* (e.g., the moves in a checkers game) and its *performance* (e.g., whether it wins the game, or how quickly it wins). \n", "\n", - "Also note that once the model is trained -- that is, once we've chosen our final, best, favorite weight assignment -- then we can think of the weights as being *part of the model*, since we're not varying them any more.\n", + "Also note that once the model is trained--that is, once we've chosen our final, best, favorite weight assignment--then we can think of the weights as being *part of the model*, since we're not varying them any more.\n", "\n", - "Therefore actually *using* a model after it's trained looks like <>." + "Therefore, actually *using* a model after it's trained looks like <>." ] }, { @@ -1206,7 +1206,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "This looks identical to our original diagram in <>, just with the word *program* replaced with *model*. This is an important insight: **a trained model can be treated just like a regular computer program**." + "This looks identical to our original diagram in <>, just with the word *program* replaced with *model*. 
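That correspondence can be sketched in a few lines of plain Python (a toy illustration only; the `model` function, the `weights`, and the threshold rule are invented for this example, not fastai code):

```python
# A regular program: its results depend only on its inputs.
def sort_program(inputs):
    return sorted(inputs)

# A model: its results depend on its inputs *and* on a weight assignment.
def model(inputs, weights):
    # Toy "model": a weighted sum of the inputs, then a threshold decision.
    score = sum(w * x for w, x in zip(weights, inputs))
    return "dog" if score > 0 else "cat"

# Once training has settled on a final weight assignment, the weights can be
# baked in, and the trained model is used exactly like a regular program.
trained_weights = [0.5, -1.2, 0.8]   # pretend these came out of training

def trained_model(inputs):
    return model(inputs, trained_weights)

print(trained_model([1.0, 0.1, 0.2]))  # called just like any ordinary function
```

The point of the sketch is only the shape of the interfaces: `sort_program(inputs)` versus `model(inputs, weights)`, and the way fixing the weights turns the second into the first.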
This is an important insight: *a trained model can be treated just like a regular computer program*." ] }, { @@ -1220,7 +1220,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### What is a neural network?" + "### What Is a Neural Network?" ] }, { @@ -1229,18 +1229,18 @@ "source": [ "It's not too hard to imagine what the model might look like for a checkers program. There might be a range of checkers strategies encoded, and some kind of search mechanism, and then the weights could vary how strategies are selected, what parts of the board are focused on during a search, and so forth. But it's not at all obvious what the model might look like for an image recognition program, or for understanding text, or for many other interesting problems we might imagine.\n", "\n", - "What we would like is some kind of function that is so flexible that it could be used to solve any given problem, just by varying its weights. Amazingly enough, this function actually exists! It's the neural network, which we already discussed. That is, if you regard a neural network as a mathematical function, it turns out to be a function which is extremely flexible depending on its weights. A mathematical proof called the *universal approximation theorem* shows that this function can solve any problem to any level of accuracy, in theory. The fact that neural networks are so flexible means that, in practice, they are often a suitable kind of model, and you can focus your effort on the process of training them, that is, of finding good weight assignments.\n", + "What we would like is some kind of function that is so flexible that it could be used to solve any given problem, just by varying its weights. Amazingly enough, this function actually exists! It's the neural network, which we already discussed. That is, if you regard a neural network as a mathematical function, it turns out to be a function which is extremely flexible depending on its weights. 
A mathematical proof called the *universal approximation theorem* shows that this function can solve any problem to any level of accuracy, in theory. The fact that neural networks are so flexible means that, in practice, they are often a suitable kind of model, and you can focus your effort on the process of training them--that is, of finding good weight assignments.\n",
    "\n",
    "But what about that process? One could imagine that you might need to find a new \"mechanism\" for automatically updating weights for every problem. This would be laborious. What we'd like here as well is a completely general way to update the weights of a neural network, to make it improve at any given task. Conveniently, this also exists!\n",
    "\n",
    "This is called *stochastic gradient descent* (SGD). We'll see how neural networks and SGD work in detail in <>, as well as explaining the universal approximation theorem. For now, however, we will instead use Samuel's own words: *We need not go into the details of such a procedure to see that it could be made entirely automatic and to see that a machine so programmed would \"learn\" from its experience.*"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-    "> J: Don't worry, neither SGD nor neural nets are mathematically complex. Both SGD and neural nets nearly entirely rely on addition and multiplication to do their work (but they do a *lot* of addition and multiplication!) 
The main reaction we hear from students when they see the details is: \"is that all it is?\"" + "> J: Don't worry, neither SGD nor neural nets are mathematically complex. Both nearly entirely rely on addition and multiplication to do their work (but they do a _lot_ of addition and multiplication!). The main reaction we hear from students when they see the details is: \"Is that all it is?\"" ] }, { @@ -1251,7 +1251,7 @@ "\n", "Having zoomed out, let's now zoom back in and revisit our image classification problem using Samuel's framework.\n", "\n", - "Our inputs, those are the images. Our weights, those are the weights in the neural net. Our model is a neural net. Our results -- those are the values that are calculated by the neural net, like \"dog\" or \"cat\".\n", + "Our inputs are the images. Our weights are the weights in the neural net. Our model is a neural net. Our results are the values that are calculated by the neural net, like \"dog\" or \"cat.\"\n", "\n", "What about the next piece, an *automatic means of testing the effectiveness of any current weight assignment in terms of actual performance*? Determining \"actual performance\" is easy enough: we can simply define our model's performance as its accuracy at predicting the correct answers.\n", "\n", @@ -1262,21 +1262,21 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "#### A bit of deep learning jargon" + "### A Bit of Deep Learning Jargon" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Samuel was working in the 1960s but terminology has changed. Here is the modern deep learning terminology for all the pieces we have discussed:\n", + "Samuel was working in the 1960s, and since then terminology has changed. 
Here is the modern deep learning terminology for all the pieces we have discussed:\n", "\n", - "- The functional form of the *model* is called its *architecture* (but be careful--sometimes people use *model* as a synonym of *architecture*, so this can get confusing) ;\n", - "- The *weights* are called *parameters* ;\n", - "- The *predictions* are calculated from the *independent variables*, which is the *data* not including the *labels* ; \n", - "- The *results* of the model are called *predictions* ;\n", - "- The measure of *performance* is called the *loss*;\n", - "- The loss depends not only on the predictions, but also the correct *labels* (also known as *targets* or *dependent variable*), e.g. \"dog\" or \"cat\".\n", + "- The functional form of the *model* is called its *architecture* (but be careful--sometimes people use *model* as a synonym of *architecture*, so this can get confusing).\n", + "- The *weights* are called *parameters*.\n", + "- The *predictions* are calculated from the *independent variable*, which is the *data* not including the *labels*.\n", + "- The *results* of the model are called *predictions*.\n", + "- The measure of *performance* is called the *loss*.\n", + "- The loss depends not only on the predictions, but also the correct *labels* (also known as *targets* or the *dependent variable*); e.g., \"dog\" or \"cat.\"\n", "\n", "After making these changes, our diagram in <> looks like <>." 
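The terms above can be made concrete with a toy training loop (a schematic sketch only, not fastai's actual implementation; the one-parameter `architecture` and the data are invented for illustration):

```python
# A one-parameter "architecture" fit to the rule y = 2x by gradient descent.

def architecture(x, params):
    """The model's functional form: predictions from the independent variable."""
    return params[0] * x

def mse_loss(preds, targets):
    """The measure of performance: mean squared error against the labels."""
    return sum((p - t) ** 2 for p, t in zip(preds, targets)) / len(targets)

xs = [1.0, 2.0, 3.0]   # independent variable (the data)
ys = [2.0, 4.0, 6.0]   # labels (the dependent variable)
params = [0.0]         # initial parameter (weight) assignment
lr = 0.01              # learning rate for the update mechanism

for _ in range(200):
    preds = [architecture(x, params) for x in xs]   # predictions
    # Gradient of the loss with respect to the parameter, computed by hand.
    grad = sum(2 * x * (p - y) for x, p, y in zip(xs, preds, ys)) / len(ys)
    params[0] -= lr * grad                          # the automatic update

print(round(params[0], 2))  # converges close to 2.0
```

Every piece of the jargon appears here: the architecture, the parameters, the predictions, the labels, the loss, and an automatic mechanism for improving the weight assignment.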
] @@ -1404,46 +1404,47 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Limitations inherent to machine learning\n", + "### Limitations Inherent To Machine Learning\n", "\n", "From this picture we can now see some fundamental things about training a deep learning model:\n", "\n", - "- A model cannot be created without data ;\n", - "- A model can only learn to operate on the patterns seen in the input data used to train it ;\n", - "- This learning approach only creates *predictions*, not recommended *actions* ;\n", - "- It's not enough to just have examples of input data; we need *labels* for that data too (e.g. pictures of dogs and cats aren't enough to train a model; we need a label for each one, saying which ones are dogs, and which are cats).\n", + "- A model cannot be created without data.\n", + "- A model can only learn to operate on the patterns seen in the input data used to train it.\n", + "- This learning approach only creates *predictions*, not recommended *actions*.\n", + "- It's not enough to just have examples of input data; we need *labels* for that data too (e.g., pictures of dogs and cats aren't enough to train a model; we need a label for each one, saying which ones are dogs, and which are cats).\n", "\n", - "Generally speaking, we've seen that most organizations that think they don't have enough data, actually mean they don't have enough *labeled* data. If any organization is interested in doing something in practice with a model, then presumably they have some inputs they plan to run their model against. And presumably they've been doing that some other way for a while (e.g. manually, or with some heuristic program), so they have data from those processes! 
For instance, a radiology practice will almost certainly have an archive of medical scans (since they need to be able to check how their patients are progressing over time), but those scans may not have structured labels containing a list of diagnoses or interventions (since radiologists generally create free text natural language reports, not structured data). We'll be discussing labeling approaches a lot in this book, since it's such an important issue in practice.\n", + "Generally speaking, we've seen that most organizations that say they don't have enough data, actually mean they don't have enough *labeled* data. If any organization is interested in doing something in practice with a model, then presumably they have some inputs they plan to run their model against. And presumably they've been doing that some other way for a while (e.g., manually, or with some heuristic program), so they have data from those processes! For instance, a radiology practice will almost certainly have an archive of medical scans (since they need to be able to check how their patients are progressing over time), but those scans may not have structured labels containing a list of diagnoses or interventions (since radiologists generally create free-text natural language reports, not structured data). We'll be discussing labeling approaches a lot in this book, because it's such an important issue in practice.\n", "\n", - "Since these kinds of machine learning models can only make *predictions* (i.e. attempt to replicate labels), this can result in a significant gap between organizational goals and model capabilities. For instance, in this book you'll learn how to create a *recommendation system* that can predict what products a user might purchase. This is often used in e-commerce, such as to customize products shown on a home page, by showing the highest-ranked items. 
But such a model is generally created by looking at a user and their buying history (*inputs*) and what they went on to buy or look at (*labels*), which means that the model is likely to tell you about products they already have, or already know about, rather than new products that they are most likely to be interested in hearing about. That's very different to what, say, an expert at your local bookseller might do, where they ask questions to figure out your taste, and then tell you about authors or series that you've never heard of before." + "Since these kinds of machine learning models can only make *predictions* (i.e., attempt to replicate labels), this can result in a significant gap between organizational goals and model capabilities. For instance, in this book you'll learn how to create a *recommendation system* that can predict what products a user might purchase. This is often used in e-commerce, such as to customize products shown on a home page by showing the highest-ranked items. But such a model is generally created by looking at a user and their buying history (*inputs*) and what they went on to buy or look at (*labels*), which means that the model is likely to tell you about products the user already has or already knows about, rather than new products that they are most likely to be interested in hearing about. That's very different to what, say, an expert at your local bookseller might do, where they ask questions to figure out your taste, and then tell you about authors or series that you've never heard of before." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Another critical insight comes from considering how a model interacts with its environment. For instance, this can create feedback loops, such as:\n", + "Another critical insight comes from considering how a model interacts with its environment. 
This can create *feedback loops*, as described here:\n", "\n", - "- A *predictive policing* model is created based on where arrests have been made in the past. In practice, this is not actually predicting crime, but rather predicting arrests, and is therefore partially simply reflecting biases in existing policing processes;\n", - "- Law enforcement officers then might use that model to decide where to focus their police activity, resulting in increased arrests in those areas;\n", - "- These additional arrests would then feed back to re-training future versions of the model;\n", - "- This is a *positive feedback loop*, where the more the model is used, the more biased the data becomes, making the model even more biased, and so forth.\n", + "- A *predictive policing* model is created based on where arrests have been made in the past. In practice, this is not actually predicting crime, but rather predicting arrests, and is therefore partially simply reflecting biases in existing policing processes.\n", + "- Law enforcement officers then might use that model to decide where to focus their police activity, resulting in increased arrests in those areas.\n", + "- Data on these additional arrests would then be fed back in to retrain future versions of the model.\n", "\n", - "This can also create problems in commercial products. For instance, a video recommendation system might be biased towards recommending content consumed by the biggest watchers of video (for instance, conspiracy theorists and extremists tend to watch more online video content than average), resulting in those users increasing their video consumption, resulting in more of those kinds of videos being recommended..." + "This is a *positive feedback loop*, where the more the model is used, the more biased the data becomes, making the model even more biased, and so forth.\n", + "\n", + "Feedback loops can also create problems in commercial settings. 
For instance, a video recommendation system might be biased toward recommending content consumed by the biggest watchers of video (e.g., conspiracy theorists and extremists tend to watch more online video content than average), resulting in those users increasing their video consumption, resulting in more of those kinds of videos being recommended. We'll consider this topic in more detail in <>."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-    "Now that we have seen the base of the theory, let's go back to our code example and see in detail how the code corresponds to the process we just described."
+    "Now that you have seen the base of the theory, let's go back to our code example and see in detail how the code corresponds to the process we just described."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-    "### How our image recognizer works"
+    "### How Our Image Recognizer Works"
   ]
  },
  {
@@ -1457,11 +1458,13 @@
   "cell_type": "markdown",
   "metadata": {},
   "source": [
+    "The first line imports all of the fastai.vision library.\n",
+    "\n",
    "```python\n",
    "from fastai2.vision.all import *\n",
    "```\n",
    "\n",
-    "The first line imports all of the fastai.vision library. This gives us all of the functions and classes we will need to create a wide variety of computer vision models."
+    "This gives us all of the functions and classes we will need to create a wide variety of computer vision models."
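The `Path` object that `untar_data` returns in the next cell comes from Python's standard `pathlib` module. As a quick standard-library-only aside (the filename here is just one from the Pet dataset, used for illustration), this is why it's handier than a plain string:

```python
from pathlib import Path

# The / operator joins path components, replacing error-prone string pasting
p = Path("images") / "great_pyrenees_173.jpg"

print(p.name)    # → great_pyrenees_173.jpg
print(p.suffix)  # → .jpg
print(p.stem)    # → great_pyrenees_173
print(p.parent)  # → images
```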
] }, { @@ -1475,55 +1478,66 @@ "cell_type": "markdown", "metadata": {}, "source": [ + "The second line downloads a standard dataset from the [fast.ai datasets collection](https://course.fast.ai/datasets) (if not previously downloaded) to your server, extracts it (if not previously extracted), and returns a `Path` object with the extracted location:\n", + "\n", "```python\n", "path = untar_data(URLs.PETS)/'images'\n", "```\n", "\n", - "The second line downloads a standard dataset from the [fast.ai datasets collection](https://course.fast.ai/datasets) (if not previously downloaded) to your server, extracts it (if not previously extracted), and returns a `Path` object with the extracted location.\n", - "\n", - "> S: Throughout my time studying fast.ai, and even still today, I've learned a lot about productive coding practices. The fastai library and fast.ai notebooks are full of great little tips that have helped make me a better programmer. For instance, notice that the fastai library doesn't just return a string containing the path to the dataset, but a Path object. This is a really useful class from the Python 3 standard library that makes accessing files and directories much easier. If you haven't come across it before, be sure to check out its documentation or a tutorial and try it out. Note that the book.fast.ai website contains links to recommended tutorials for each chapter. I'll keep letting you know about little coding tips I've found useful as we come across them." + "> S: Throughout my time studying at fast.ai, and even still today, I've learned a lot about productive coding practices. The fastai library and fast.ai notebooks are full of great little tips that have helped make me a better programmer. For instance, notice that the fastai library doesn't just return a string containing the path to the dataset, but a `Path` object. This is a really useful class from the Python 3 standard library that makes accessing files and directories much easier. 
If you haven't come across it before, be sure to check out its documentation or a tutorial and try it out. Note that the [website](https://book.fast.ai) contains links to recommended tutorials for each chapter. I'll keep letting you know about little coding tips I've found useful as we come across them."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
+    "In the third line we define a function, `is_cat`, which labels cats based on a filename rule provided by the dataset creators:\n",
    "```python\n",
    "def is_cat(x): return x[0].isupper()\n",
+    "```"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "We use that function in the fourth line, which tells fastai what kind of dataset we have, and how it is structured:\n",
+    "\n",
+    "```python\n",
    "dls = ImageDataLoaders.from_name_func(\n",
    "    path, get_image_files(path), valid_pct=0.2, seed=42,\n",
    "    label_func=is_cat, item_tfms=Resize(224))\n",
    "```\n",
    "\n",
-    "The fourth line tells fastai what kind of dataset we have, and how it is structured. There are various different classes for different kinds of deep learning datasets and problems--here we're using `ImageDataLoaders`. The first part of the class name will generally be the type of data you have, such as image, or text. The second part will generally be the type of problem you are solving, such as classification, or regression.\n",
+    "There are various different classes for different kinds of deep learning datasets and problems--here we're using `ImageDataLoaders`. The first part of the class name will generally be the type of data you have, such as image, or text.\n",
    "\n",
-    "The other important piece of information that we have to tell fastai is how to get the labels from the dataset. Computer vision datasets are normally structured in such a way that the label for an image is part of the file name, or path, most commonly the parent folder name. Fastai comes with a number of standardized labelling methods, and ways to write your own. 
Here we define a function on the third line: `is_cat` which labels cats based on a filename rule provided by the dataset creators.\n", + "The other important piece of information that we have to tell fastai is how to get the labels from the dataset. Computer vision datasets are normally structured in such a way that the label for an image is part of the filename, or path--most commonly the parent folder name. Fastai comes with a number of standardized labeling methods, and ways to write your own. Here we're telling fastai to use the `is_cat` function we just defined.\n", "\n", - "Finally, we define the `Transform`s that we need. A `Transform` contains code that is applied automatically during training; fastai includes many pre-defined `Transform`s, and adding new ones is as simple as creating a Python function. There are two kinds: `item_tfms` are applied to each item (in this case, each item is resized to a 224 pixel square); `batch_tfms` are applied to a *batch* of items at a time using the GPU, so they're particularly fast (we'll see many examples of these throughout this book).\n", + "Finally, we define the `Transform`s that we need. A `Transform` contains code that is applied automatically during training; fastai includes many predefined `Transform`s, and adding new ones is as simple as creating a Python function. There are two kinds: `item_tfms` are applied to each item (in this case, each item is resized to a 224-pixel square), while `batch_tfms` are applied to a *batch* of items at a time using the GPU, so they're particularly fast (we'll see many examples of these throughout this book).\n", "\n", - "Why 224 pixels? This is the standard size for historical reasons (old pretrained models require this size exactly), but you can pass pretty much anything. 
If you increase the size, you'll often get a model with better results (since it will be able to focus on more details) but at the price of speed and memory consumption; or vice versa if you decrease the size. " + "Why 224 pixels? This is the standard size for historical reasons (old pretrained models require this size exactly), but you can pass pretty much anything. If you increase the size, you'll often get a model with better results (since it will be able to focus on more details), but at the price of speed and memory consumption; the opposite is true if you decrease the size. " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "> Note: _classification_ and _regression_ have very specific meanings in machine learning. These are the two main types of model that we will be investigating in this book. A classification model is one which attempts to predict a class, or category. That is, predicting from a number of discrete possibilities, such as \"dog\" or \"cat\". A regression model is one which attempts to predict one or more numeric quantities, such as temperature, or a location. Sometimes people use the word _regression_ as a shortcut to a particular kind of model called a _linear regression model_; this is a bad practice, and we won't be using that terminology in this book!" + "> Note: Classification and Regression: _classification_ and _regression_ have very specific meanings in machine learning. These are the two main types of model that we will be investigating in this book. A classification model is one which attempts to predict a class, or category. That is, it's predicting from a number of discrete possibilities, such as \"dog\" or \"cat.\" A regression model is one which attempts to predict one or more numeric quantities, such as a temperature or a location. 
Sometimes people use the word _regression_ to refer to a particular kind of model called a _linear regression model_; this is a bad practice, and we won't be using that terminology in this book!" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "The pets dataset contains 7390 pictures of dogs and cats, consisting of 37 different breeds. Each image is labeled using its filename, for instance the file `great_pyrenees_173.jpg` is the 173rd example of an image of a great pyrenees breed dog in the dataset. The filenames start with an uppercase letter if the image is a cat, and a lowercase letter otherwise. We have to tell fastai how to get labels from the filenames, which we do by calling `from_name_func` (which means that filenames can be extracted using a function applied to the file name), and passing `x[0].isupper()`, which evaluates to `True` if the first letter is uppercase (i.e. it's a cat).\n", + "The Pet dataset contains 7,390 pictures of dogs and cats, consisting of 37 different breeds. Each image is labeled using its filename: for instance the file *great\\_pyrenees\\_173.jpg* is the 173rd example of an image of a Great Pyrenees breed dog in the dataset. The filenames start with an uppercase letter if the image is a cat, and a lowercase letter otherwise. We have to tell fastai how to get labels from the filenames, which we do by calling `from_name_func` (which means that filenames can be extracted using a function applied to the filename), and passing `x[0].isupper()`, which evaluates to `True` if the first letter is uppercase (i.e., it's a cat).\n", "\n", - "The most important parameter to mention here is `valid_pct=0.2`. This tells fastai to hold out 20% of the data and *not use it for training the model at all*. This 20% of the data is called the *validation set*; the remaining 80% is called the *training set*. The validation set is used to measure the accuracy of the model. By default, the 20% that is held out is selected randomly. 
The parameter `seed=42` sets the *random seed* to the same value every time we run this code, which means we get the same validation set every time we run this code--that way, if you change your model and re-train it, you know that changes are due to your model, not due to having a different random validation set.\n", + "The most important parameter to mention here is `valid_pct=0.2`. This tells fastai to hold out 20% of the data and *not use it for training the model at all*. This 20% of the data is called the *validation set*; the remaining 80% is called the *training set*. The validation set is used to measure the accuracy of the model. By default, the 20% that is held out is selected randomly. The parameter `seed=42` sets the *random seed* to the same value every time we run this code, which means we get the same validation set every time we run it--this way, if we change our model and retrain it, we know that any differences are due to the changes to the model, not due to having a different random validation set.\n", "\n", - "fastai will *always* show you your model's accuracy using *only* the validation set, *never* the training set. This is absolutely critical, because if you train a large enough model for a long enough time, it will eventually learn to *memorize* the label of every item in your dataset! This is not actually a useful model, because what we care about is how well our model works on *previously unseen images*. That is always our goal when creating a model: to be useful on data that the model only sees in the future, after it has been trained.\n", + "fastai will *always* show you your model's accuracy using *only* the validation set, *never* the training set. This is absolutely critical, because if you train a large enough model for a long enough time, it will eventually memorize the label of every item in your dataset! 
The result will not actually be a useful model, because what we care about is how well our model works on *previously unseen images*. That is always our goal when creating a model: for it to be useful on data that the model only sees in the future, after it has been trained.\n", "\n", - "Even when your model has not fully memorized all your data, earlier on in training it may have memorized certain parts of it. As a result, the longer you train for, the better your accuracy will get on the training set; and the validation set accuracy will also improve for a while, but eventually it will start getting worse, as the model starts to memorize the training set, rather than finding generalizable underlying patterns in the data. When this happens, we say that the model is *over-fitting*.\n", + "Even when your model has not fully memorized all your data, earlier on in training it may have memorized certain parts of it. As a result, the longer you train for, the better your accuracy will get on the training set; the validation set accuracy will also improve for a while, but eventually it will start getting worse as the model starts to memorize the training set, rather than finding generalizable underlying patterns in the data. When this happens, we say that the model is *overfitting*.\n", "\n", - "<> shows what happens when you overfit, using a simplified example where we have just one parameter, and some randomly generated data based on the function `x**2`; as you see, although the predictions in the overfit model are accurate for data near the observed data, they are way off when outside of that range." + "<> shows what happens when you overfit, using a simplified example where we have just one parameter, and some randomly generated data based on the function `x**2`. As you can see, although the predictions in the overfit model are accurate for data near the observed data points, they are way off when outside of that range." 
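The behavior just described can be reproduced in a few lines of NumPy (a toy sketch, not code from the book): a needlessly flexible degree-15 polynomial hugs noisy samples of `x**2` at least as tightly as a quadratic does, yet its predictions fall apart outside the observed range.

```python
import numpy as np

rng = np.random.default_rng(42)
x = np.linspace(-2, 2, 20)
y = x**2 + rng.normal(0, 0.1, size=x.shape)  # noisy samples of x**2

# A reasonable quadratic fit vs. a needlessly flexible degree-15 fit
quadratic = np.polynomial.Polynomial.fit(x, y, deg=2)
overfit = np.polynomial.Polynomial.fit(x, y, deg=15)

def mse(model):
    # Mean squared error on the training points themselves
    return np.mean((model(x) - y) ** 2)

# The flexible model matches the observed points at least as closely...
print(mse(overfit) <= mse(quadratic))  # → True

# ...but extrapolates far worse outside the observed range
print(abs(quadratic(3.0) - 9.0))  # close to the true value of 3**2
print(abs(overfit(3.0) - 9.0))    # way off
```

Low training error is therefore no evidence of a good model, which is exactly why the accuracy reported on a held-out validation set matters.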
] }, { @@ -1537,42 +1551,42 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "**Overfitting is the single most important and challenging issue** when training for all machine learning practitioners, and all algorithms. As we will see, it is very easy to create a model that does a great job at making predictions on the exact data which it has been trained on, but it is much harder to make predictions on data that it has never seen before. And of course, this is the data that will actually matter in practice. For instance, if you create a hand-written digit classifier (as we will very soon!) and use it to recognise numbers written on cheques, then you are never going to see any of the numbers that the model was trained on -- every cheque will have slightly different variations of writing to deal with. We will learn many methods to avoid overfitting in this book. However, you should only use those methods after you have confirmed that overfitting is actually occurring (i.e. you have actually observed the validation accuracy getting worse during training). We often see practitioners using over-fitting avoidance techniques even when they have enough data that they didn't need to do so, ending up with a model that could be less accurate than what they could have achieved." + "**Overfitting is the single most important and challenging issue** when training for all machine learning practitioners, and all algorithms. As you will see, it is very easy to create a model that does a great job at making predictions on the exact data it has been trained on, but it is much harder to make accurate predictions on data the model has never seen before. And of course, this is the data that will actually matter in practice. For instance, if you create a handwritten digit classifier (as we will very soon!) 
and use it to recognize numbers written on checks, then you are never going to see any of the numbers that the model was trained on--every check will have slightly different variations of writing to deal with. You will learn many methods to avoid overfitting in this book. However, you should only use those methods after you have confirmed that overfitting is actually occurring (i.e., you have actually observed the validation accuracy getting worse during training). We often see practitioners using overfitting avoidance techniques even when they have enough data that they didn't need to do so, ending up with a model that may be less accurate than what they could have achieved."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-    "> important: When you train a model, you must **always** have both a training set and a validation set, and must measure the accuracy of your model only on the validation set. If you train for too long, with not enough data, you will see the accuracy of your model start to get worse; this is called **over-fitting**. fastai defaults `valid_pct` to `0.2`, so even if you forget, fastai will create a validation set for you!"
+    "> important: Validation Set: When you train a model, you must _always_ have both a training set and a validation set, and must measure the accuracy of your model only on the validation set. If you train for too long, with not enough data, you will see the accuracy of your model start to get worse; this is called _overfitting_. fastai defaults `valid_pct` to `0.2`, so even if you forget, fastai will create a validation set for you!"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
+    "The fifth line of the code training our image recognizer tells fastai to create a *convolutional neural network* (CNN) and specifies what *architecture* to use (i.e. 
what kind of model to create), what data we want to train it on, and what *metric* to use:\n", + "\n", "```python\n", "learn = cnn_learner(dls, resnet34, metrics=error_rate)\n", "```\n", "\n", - "The fifth line tells fastai to create a *convolutional neural network* (CNN), and selects what *architecture* to use (i.e. what kind of model to create), what data we want to train it on, and what *metric* to use. \n", + "Why a CNN? It's the current state-of-the-art approach to creating computer vision models. We'll be learning all about how CNNs work in this book. Their structure is inspired by how the human vision system works.\n", "\n", - "Why a CNN? A CNN is the current state of the art approach to creating computer vision models. We'll be learning all about how they work in this book. Their structure is inspired by how the human vision system works.\n", + "There are many different architectures in fastai, which we will introduce in this book (as well as discussing how to create your own). Most of the time, however, picking an architecture isn't a very important part of the deep learning process. It's something that academics love to talk about, but in practice it is unlikely to be something you need to spend much time on. There are some standard architectures that work most of the time, and in this case we're using one called _ResNet_ that we'll be talking a lot about during the book; it is both fast and accurate for many datasets and problems. The `34` in `resnet34` refers to the number of layers in this variant of the architecture (other options are `18`, `50`, `101`, and `152`). Models using architectures with more layers take longer to train, and are more prone to overfitting (i.e. you can't train them for as many epochs before the accuracy on the validation set starts getting worse). 
On the other hand, when using more data, they can be quite a bit more accurate.\n", "\n", - "There are many different architectures in fastai, which we will be learning about in this book, as well as discussing how to create your own. Most of the time, however, picking an architecture isn't a very important part of the deep learning process. It's something that academics love to talk about, but in practice it is unlikely to be something you need to spend much time on. There are some standard architectures that work most of the time, and in this case we're using one called _ResNet_ that we'll be learning a lot about during the book; it is both fast and accurate for many datasets and problems. The \"34\" in `resnet34` refers to the number of layers in this variant of the architecture (other options are \"18\", \"50\", \"101\", and \"152\"). Models using architectures with more layers take longer to train, and are more prone to overfitting (i.e. you can't train them for as many epochs before the accuracy on the validation set starts getting worse). On the other hand, when using more data, they can be quite a bit more accurate.\n", + "What is a metric? A *metric* is a function that measures the quality of the model's predictions using the validation set, and will be printed at the end of each *epoch*. In this case, we're using `error_rate`, which is a function provided by fastai that does just what it says: tells you what percentage of images in the validation set are being classified incorrectly. Another common metric for classification is `accuracy` (which is just `1.0 - error_rate`). fastai provides many more, which will be discussed throughout this book.\n", "\n", - "What is a metric? A *metric* is a function that measures quality of the model's predictions using the validation set, and will be printed at the end of each *epoch*. 
In this case, we're using `error_rate`, which is a function provided by fastai which does just what it says: tells you what percentage of images in the validation set are being classified incorrectly. Another common metric for classification is `accuracy` (which is just `1.0 - error_rate`). fastai provides many more, which will be discussed throughout this book.\n", - "\n", - "The concept of a metric may remind you of loss, but there is an important distinction. The entire purpose of loss was to define a \"measure of performance\" that the training system could use to update weights automatically. In other words, a good choice for loss is a choice that is easy for stochastic gradient descent to use. But a metric is defined for human consumption. So a good metric is one that is easy for you to understand, and that hews as closely as possible to what you want the model to do. At times, you might decide that the loss function is a suitable metric, but that is not necessarily the case." + "The concept of a metric may remind you of *loss*, but there is an important distinction. The entire purpose of loss is to define a \"measure of performance\" that the training system can use to update weights automatically. In other words, a good choice for loss is a choice that is easy for stochastic gradient descent to use. But a metric is defined for human consumption, so a good metric is one that is easy for you to understand, and that hews as closely as possible to what you want the model to do. At times, you might decide that the loss function is a suitable metric, but that is not necessarily the case." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "`cnn_learner` also has a parameter `pretrained`, which defaults to `True` (so it's used in this case), which sets the weights in your model to values that have already been trained by experts to recognize a thousand different categories across 1.3 million photos (using the famous *ImageNet* dataset). 
A model that has weights that have already been trained on some other dataset is called a *pretrained model*. You should nearly always use a pretrained model, because it means that your model, before you've even shown it any of your data, is already very capable. And, as you'll see, in a deep learning model many of these capabilities are things you'll need, almost regardless of the details of your project. For instance, parts of pretrained models will handle edge-, gradient-, and color-detection, which are needed for many tasks.\n", + "`cnn_learner` also has a parameter `pretrained`, which defaults to `True` (so it's used in this case, even though we haven't specified it), which sets the weights in your model to values that have already been trained by experts to recognize a thousand different categories across 1.3 million photos (using the famous [*ImageNet* dataset](http://www.image-net.org/)). A model that has weights that have already been trained on some other dataset is called a *pretrained model*. You should nearly always use a pretrained model, because it means that your model, before you've even shown it any of your data, is already very capable. And, as you'll see, in a deep learning model many of these capabilities are things you'll need, almost regardless of the details of your project. For instance, parts of pretrained models will handle edge, gradient, and color detection, which are needed for many tasks.\n", "\n", - "When using a pretrained model, `cnn_learner` will remove the last layer, since that is always specifically customized to the original training task (i.e. ImageNet dataset classification), and replace it with one or more new layers with randomized weights, of an appropriate size for the dataset you are working with. This last part of the model is known as the `head`.\n", + "When using a pretrained model, `cnn_learner` will remove the last layer, since that is always specifically customized to the original training task (i.e. 
ImageNet dataset classification), and replace it with one or more new layers with randomized weights, of an appropriate size for the dataset you are working with. This last part of the model is known as the *head*.\n", "\n", "Using pretrained models is the *most* important method we have to allow us to train more accurate models, more quickly, with less data, and less time and money. You might think that would mean that using pretrained models would be the most studied area in academic deep learning... but you'd be very, very wrong! The importance of pretrained models is generally not recognized or discussed in most courses, books, or software library features, and is rarely considered in academic papers. As we write this at the start of 2020, things are just starting to change, but it's likely to take a while. So be careful: most people you speak to will probably greatly underestimate what you can do in deep learning with few resources, because they probably won't deeply understand how to use pretrained models.\n", "\n", @@ -1590,22 +1604,24 @@ "cell_type": "markdown", "metadata": {}, "source": [ + "The sixth line of our code tells fastai how to *fit* the model:\n", + "\n", "```python\n", "learn.fine_tune(1)\n", "```\n", "\n", - "The sixth line tells fastai how to *fit* the model. As we've discussed, the architecture only describes a *template* for a mathematical function; but it doesn't actually do anything until we provide values for the millions of parameters it contains.\n", + "As we've discussed, the architecture only describes a *template* for a mathematical function; it doesn't actually do anything until we provide values for the millions of parameters it contains.\n", "\n", - "This is the key to deep learning — how to fit the parameters of a model to get it to solve your problem. In order to fit a model, we have to provide at least one piece of information: how many times to look at each image (known as number of *epochs*). 
The number of epochs you select will largely depend on how much time you have available, and how long you find it takes in practice to fit your model. If you select a number that is too small, you can always train for more epochs later.\n", + "This is the key to deep learning—determining how to fit the parameters of a model to get it to solve your problem. In order to fit a model, we have to provide at least one piece of information: how many times to look at each image (known as number of *epochs*). The number of epochs you select will largely depend on how much time you have available, and how long you find it takes in practice to fit your model. If you select a number that is too small, you can always train for more epochs later.\n", "\n", - "But why is the method called `fine_tune`, and not `fit`? fastai actually *does* have a method called `fit`, which does indeed fit a model (i.e. look at images in the training set multiple times, each time updating the *parameters* to make the predictions closer and closer to the *target labels*). But in this case, we've started with a pretrained model, and we don't want to throw away all those capabilities that it already has. As we'll learn in this book, there are some important tricks to adapt a pretrained model for a new dataset -- a process called *fine-tuning*." + "But why is the method called `fine_tune`, and not `fit`? fastai actually *does* have a method called `fit`, which does indeed fit a model (i.e. look at images in the training set multiple times, each time updating the parameters to make the predictions closer and closer to the target labels). But in this case, we've started with a pretrained model, and we don't want to throw away all those capabilities that it already has. As you'll learn in this book, there are some important tricks to adapt a pretrained model for a new dataset--a process called *fine-tuning*." 
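The core idea of adapting a pretrained model can be pictured with plain Python (a toy illustration with made-up weight values, not fastai's actual implementation): the body's learned weights are kept, and only the task-specific head is replaced with a fresh, untrained one.

```python
# Toy illustration (not fastai code): keep the pretrained body's weights,
# discard the old task-specific head, and attach a new untrained head.
pretrained = {"body": [0.7, -1.2, 0.4], "head": [2.1, -0.3]}  # made-up weights

def adapt(model, n_out):
    """Return a copy of the model with a fresh head of n_out outputs."""
    return {"body": list(model["body"]), "head": [0.0] * n_out}

new_model = adapt(pretrained, n_out=2)
print(new_model["body"] == pretrained["body"])  # → True: capabilities kept
print(new_model["head"])                        # → [0.0, 0.0]: needs training
```

This is why fine-tuning starts by training the new head: it is the only part of the model that has not already learned anything useful.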
] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "> jargon: Fine tuning: A transfer learning technique where the parameters of a pretrained model are updated by training for additional epochs using a different task to that used for pretraining." + "> jargon: Fine-tuning: A transfer learning technique where the parameters of a pretrained model are updated by training for additional epochs using a different task to that used for pretraining." ] }, { @@ -1614,8 +1630,8 @@ "source": [ "When you use the `fine_tune` method, fastai will use these tricks for you. There are a few parameters you can set (which we'll discuss later), but in the default form shown here, it does two steps:\n", "\n", - "1. Use one *epoch* to fit just those parts of the model necessary to get the new random *head* to work correctly with your dataset\n", - "1. Use the number of epochs requested when calling the method to fit the entire model, updating the weights of the later layers (especially the head) faster than the earlier layers (which, as we'll see, generally don't require many changes from the pretrained weights)\n", + "1. Use one epoch to fit just those parts of the model necessary to get the new random head to work correctly with your dataset.\n", + "1. Use the number of epochs requested when calling the method to fit the entire model, updating the weights of the later layers (especially the head) faster than the earlier layers (which, as we'll see, generally don't require many changes from the pretrained weights).\n", "\n", "The *head* of a model is the part that is newly added to be specific to the new dataset. An *epoch* is one complete pass through the dataset. After calling `fit`, the results after each epoch are printed, showing the epoch number, the training and validation set losses (the \"measure of performance\" used for training the model), and any *metrics* you've requested (error rate, in this case)." 
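To make the error rate metric concrete, here is a hedged, framework-free sketch of the idea (not fastai's actual code): count the fraction of validation items whose highest-scoring class differs from the true label, which also shows why accuracy is simply `1.0 - error_rate`.

```python
import numpy as np

def error_rate(preds, targets):
    """Fraction of items whose highest-scoring class differs from the label.
    A sketch of the concept only -- not fastai's implementation."""
    return (preds.argmax(axis=1) != targets).mean()

# Four validation items, two classes; only the third prediction is wrong
preds = np.array([[0.9, 0.1], [0.3, 0.7], [0.6, 0.4], [0.2, 0.8]])
targets = np.array([0, 1, 1, 1])

print(error_rate(preds, targets))        # → 0.25
print(1.0 - error_rate(preds, targets))  # accuracy → 0.75
```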
] @@ -1631,44 +1647,44 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### What our image recognizer learned" + "### What Our Image Recognizer Learned" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "At this stage we have an image recogniser that is working very well, but we have no idea what it is actually doing! Although many people complain that deep learning results in impenetrable \"black box\" models (that is, something that gives predictions but that no one can understand), this really couldn't be further from the truth. There is a vast body of research showing how to deeply inspect deep learning models, and get rich insights from them. Having said that, all kinds of machine learning models (including deep learning, and traditional statistical models) can be challenging to fully understand, especially when considering how they will behave when coming across data that is very different to the data used to train them. We'll be discussing this issue throughout this book.\n", + "At this stage we have an image recognizer that is working very well, but we have no idea what it is actually doing! Although many people complain that deep learning results in impenetrable \"black box\" models (that is, something that gives predictions but that no one can understand), this really couldn't be further from the truth. There is a vast body of research showing how to deeply inspect deep learning models, and get rich insights from them. Having said that, all kinds of machine learning models (including deep learning, and traditional statistical models) can be challenging to fully understand, especially when considering how they will behave when coming across data that is very different to the data used to train them. 
We'll be discussing this issue throughout this book.\n", "\n", - "In 2013 a PhD student, Matt Zeiler, and his supervisor, Rob Fergus, published the paper [Visualizing and Understanding Convolutional Networks](https://arxiv.org/pdf/1311.2901.pdf), which showed how to visualise the neural network weights learned in each layer of a model. They carefully analysed the model that won the 2012 ImageNet competition, and used this analysis to greatly improve the model, such that they were able to go on to win the 2013 competition! <> is the picture that they published of the first layers' weights." + "In 2013 a PhD student, Matt Zeiler, and his supervisor, Rob Fergus, published the paper [\"Visualizing and Understanding Convolutional Networks\"](https://arxiv.org/pdf/1311.2901.pdf), which showed how to visualize the neural network weights learned in each layer of a model. They carefully analyzed the model that won the 2012 ImageNet competition, and used this analysis to greatly improve the model, such that they were able to go on to win the 2013 competition! <> is the picture that they published of the first layer's weights." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "\"Activations" + "\"Activations" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "This picture requires some explanation. For each layer, the image part with the light grey background shows the reconstructed weights pictures, and the other section shows the parts of the training images which most strongly matched each set of weights. For layer 1, what we can see is that the model has discovered weights which represent diagonal, horizontal, and vertical edges, as well as various different gradients. (Note that for each layer only a subset of the features are shown; in practice there are thousands across all of the layers.) These are the basic building blocks that it has created automatically for computer vision. 
They have been widely analysed by neuroscientists and computer vision researchers, and it turns out that these learned building blocks are very similar to the basic visual machinery in the human eye, as well as the handcrafted computer vision features that were developed prior to the days of deep learning. The next layer is represented in <>." + "This picture requires some explanation. For each layer, the image part with the light gray background shows the reconstructed weights pictures, and the larger section at the bottom shows the parts of the training images that most strongly matched each set of weights. For layer 1, what we can see is that the model has discovered weights that represent diagonal, horizontal, and vertical edges, as well as various different gradients. (Note that for each layer only a subset of the features are shown; in practice there are thousands across all of the layers.) These are the basic building blocks that the model has learned for computer vision. They have been widely analyzed by neuroscientists and computer vision researchers, and it turns out that these learned building blocks are very similar to the basic visual machinery in the human eye, as well as the handcrafted computer vision features that were developed prior to the days of deep learning. The next layer is represented in <>." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "\"Activations" + "\"Activations" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "For layer 2, there are nine examples of weight reconstructions for each of the features found by the model. We can see that the model has learned to create feature detectors that look for corners, repeating lines, circles, and other simple patterns. These are built from the basic building blocks developed in the first layer. For each of these, the right-hand side of the picture shows small patches from actual images which these features most closely match. 
For instance, the particular pattern in row 2 column 1 matches the gradients and textures associated with sunsets.\n", + "For layer 2, there are nine examples of weight reconstructions for each of the features found by the model. We can see that the model has learned to create feature detectors that look for corners, repeating lines, circles, and other simple patterns. These are built from the basic building blocks developed in the first layer. For each of these, the right-hand side of the picture shows small patches from actual images which these features most closely match. For instance, the particular pattern in row 2, column 1 matches the gradients and textures associated with sunsets.\n", "\n", "<> shows the image from the paper showing the results of reconstructing the features of layer 3." ] @@ -1677,46 +1693,46 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "\"Activations" + "\"Activations" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "As you can see by looking at the right-hand side of this picture, the features are now able to identify and match with higher-level semantic components, such as car wheels, text, and flower petals. Using these components, layers four and five can identify even higher-level concepts, as shown in <>." + "As you can see by looking at the righthand side of this picture, the features are now able to identify and match with higher-level semantic components, such as car wheels, text, and flower petals. Using these components, layers four and five can identify even higher-level concepts, as shown in <>." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "\"Activations" + "\"Activations" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "This article was studying an older model called `AlexNet` that only contained five layers. Networks developed since then can have hundreds of layers--so you can imagine how rich the features developed by these models can be! 
\n", + "This article was studying an older model called *AlexNet* that only contained five layers. Networks developed since then can have hundreds of layers--so you can imagine how rich the features developed by these models can be! \n", "\n", - "When we fine-tuned our pretrained model earlier, we adapted what those last layers focus on (flowers, humans, animals) to specialize on the cats versus dogs problem. More generally, we could specialize such a pretrained problem on many different tasks. Let's have a look at some examples. " + "When we fine-tuned our pretrained model earlier, we adapted what those last layers focus on (flowers, humans, animals) to specialize on the cats versus dogs problem. More generally, we could specialize such a pretrained model on many different tasks. Let's have a look at some examples. " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Image recognizers can tackle non-image tasks" + "### Image Recognizers Can Tackle Non-Image Tasks" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "An image recogniser can, as its name suggests, only recognise images. But a lot of things can be represented as images, which means that an image recogniser can learn to complete many tasks.\n", + "An image recognizer can, as its name suggests, only recognize images. But a lot of things can be represented as images, which means that an image recognizer can learn to complete many tasks.\n", "\n", - "For instance, a sound can be converted to a spectrogram, which is a chart that shows the amount of each frequency at each time in an audio file. Fast.ai student Ethan Sutin used this approach to easily beat the published accuracy on [environmental sound detection](https://medium.com/@etown/great-results-on-audio-classification-with-fastai-library-ccaf906c5f52) using a dataset of 8732 urban sounds. fastai's `show_batch` clearly shows how each different sound has a quite distinctive spectrogram, as you can see in <>."
+ "For instance, a sound can be converted to a spectrogram, which is a chart that shows the amount of each frequency at each time in an audio file. Fast.ai student Ethan Sutin used this approach to easily beat the published accuracy of a state-of-the-art [environmental sound detection model](https://medium.com/@etown/great-results-on-audio-classification-with-fastai-library-ccaf906c5f52) using a dataset of 8,732 urban sounds. fastai's `show_batch` clearly shows how each different sound has a quite distinctive spectrogram, as you can see in <>." ] }, { @@ -1730,7 +1746,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Time series can be easily converted into an image by simply plotting the time series in a graph. However, it is often a good idea to try to represent your data in a way that makes it as easy as possible to pull out the most important components. In a time-series, things like seasonality and anomalies are most likely to be of interest. There are various transformations available for time series data; for instance, fast.ai student Ignacio Oguiza created images from a time series data set for olive oil classification. He used a technique called Gramian Angular Field (GAF), and you can see the result in <>. He then fed those images to an image classification model just like the one you see in this chapter. His results, despite having only 30 training set images, were well over 90% accurate, and close to the state-of-the-art." + "A time series can easily be converted into an image by simply plotting the time series on a graph. However, it is often a good idea to try to represent your data in a way that makes it as easy as possible to pull out the most important components. In a time series, things like seasonality and anomalies are most likely to be of interest. There are various transformations available for time series data. 
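As a toy illustration of this kind of transformation, here is a minimal Gramian Angular Difference Field in plain Python, assuming the series has already been rescaled to the range [-1, 1]. This is a sketch of the idea only, not the exact pipeline used in practice:

```python
import math

def gadf(series):
    """Gramian Angular Difference Field for a series already scaled to [-1, 1]:
    encode each value as an angle, then take the sine of every pairwise angle
    difference, turning the 1D series into a 2D 'image'."""
    phi = [math.acos(x) for x in series]
    return [[math.sin(a - b) for b in phi] for a in phi]

img = gadf([-1.0, 0.0, 1.0])
# the diagonal is always zero, since sin(phi_i - phi_i) = 0
```

The resulting matrix can then be rendered as an image and fed to an ordinary image classifier.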
For instance, fast.ai student Ignacio Oguiza created images from a time series dataset for olive oil classification, using a technique called Gramian Angular Difference Field (GADF); you can see the result in <>. He then fed those images to an image classification model just like the one you see in this chapter. His results, despite having only 30 training set images, were well over 90% accurate, and close to the state of the art." ] }, { @@ -1744,7 +1760,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Another interesting fast.ai student project example comes from Gleb Esman. He was working on fraud detection at Splunk, and was working with a dataset of users' mouse movements and mouse clicks. He turned these into pictures by drawing an image where the position, speed and acceleration of the mouse was displayed using coloured lines, and the clicks were displayed using [small coloured circles](https://www.splunk.com/en_us/blog/security/deep-learning-with-splunk-and-tensorflow-for-security-catching-the-fraudster-in-neural-networks-with-behavioral-biometrics.html) as shown in <>. He then fed this into an image recognition model just like the one we've shown in this chapter, and it worked so well that it led to a patent for this approach to fraud analytics!" + "Another interesting fast.ai student project example comes from Gleb Esman. He was working on fraud detection at Splunk, using a dataset of users' mouse movements and mouse clicks. He turned these into pictures by drawing an image where the position, speed, and acceleration of the mouse pointer was displayed using colored lines, and the clicks were displayed using [small colored circles](https://www.splunk.com/en_us/blog/security/deep-learning-with-splunk-and-tensorflow-for-security-catching-the-fraudster-in-neural-networks-with-behavioral-biometrics.html), as shown in <>.
He then fed this into an image recognition model just like the one we've used in this chapter, and it worked so well that it led to a patent for this approach to fraud analytics!" ] }, { @@ -1758,7 +1774,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Another example comes from the paper [Malware Classification with Deep Convolutional Neural Networks](https://ieeexplore.ieee.org/abstract/document/8328749) which explains that \"the malware binary file is divided into 8-bit sequences which are then converted to equivalent decimal values. This decimal vector is reshaped and gray-scale image is generated that represent the malware sample\", like in <>." + "Another example comes from the paper [\"Malware Classification with Deep Convolutional Neural Networks\"](https://ieeexplore.ieee.org/abstract/document/8328749) by Mahmoud Kalash et al., which explains that \"the malware binary file is divided into 8-bit sequences which are then converted to equivalent decimal values. This decimal vector is reshaped and a gray-scale image is generated that represents the malware sample,\" like in <>." ] }, { @@ -1772,7 +1788,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "They then show \"pictures\" generated through this process of malware in different categories, as shown in <>." + "The authors then show \"pictures\" generated through this process of malware in different categories, as shown in <>." ] }, { @@ -1786,23 +1802,23 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "As you can see, the different types of malware look very distinctive to the human eye. The model they trained based on this image representation was more accurate at malware classification than any previous approach shown in the academic literature. 
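The conversion the paper describes--bytes to decimal values to a reshaped grayscale image--can be sketched in a few lines of plain Python (the 8-byte input and the width of 4 here are made up for illustration):

```python
def bytes_to_grayscale(data, width):
    """Split a binary blob into 8-bit values (0-255) and reshape them into
    rows of `width` pixels, i.e. a grayscale 'image' of the file."""
    vals = list(data)                       # each byte -> its decimal value
    usable = len(vals) - len(vals) % width  # drop any incomplete final row
    return [vals[i:i + width] for i in range(0, usable, width)]

img = bytes_to_grayscale(b"\x00\x10\x20\x30\x40\x50\x60\x70", width=4)
# -> [[0, 16, 32, 48], [64, 80, 96, 112]]
```

Each row of the result is one row of pixels; rendering it with any image library gives the kind of picture shown in the paper.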
This suggests a good rule of thumb for converting a dataset into an image representation: if the human eye can recognize categories from the images, then a deep learning model should be able to do so too.\n", + "As you can see, the different types of malware look very distinctive to the human eye. The model the researchers trained based on this image representation was more accurate at malware classification than any previous approach shown in the academic literature. This suggests a good rule of thumb for converting a dataset into an image representation: if the human eye can recognize categories from the images, then a deep learning model should be able to do so too.\n", "\n", - "In general, you'll find that a small number of general approaches in deep learning can go a long way, if you're a bit creative in how you represent your data! You shouldn't think of approaches like the above as \"hacky workarounds\", since actually they often (as here) beat previously state of the art results. These really are the right way to think about these problem domains." + "In general, you'll find that a small number of general approaches in deep learning can go a long way, if you're a bit creative in how you represent your data! You shouldn't think of approaches like the ones described here as \"hacky workarounds,\" because actually they often (as here) beat previously state-of-the-art results. These really are the right ways to think about these problem domains." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Jargon recap" + "### Jargon Recap" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "We just covered a lot of information so let's recap briefly. 
<> provides a handy list.\n", + "We just covered a lot of information so let's recap briefly. <> provides a handy vocabulary.\n", "\n", "```asciidoc\n", "[[dljargon]]\n", @@ -1810,21 +1826,21 @@ "[options=\"header\"]\n", "|=====\n", "| Term | Meaning\n", - "|**label** | The data that we're trying to predict, such as \"dog\" or \"cat\"\n", - "|**architecture** | The _template_ of the model that we're trying to fit; the actual mathematical function that we're passing the input data and parameters to\n", - "|**model** | the combination of the architecture with a particular set of parameters\n", - "|**parameters** | the values in the model that change what task it can do, and are updated through model training\n", - "|**fit** | Update the parameters of the model such that the predictions of the model using the input data match the target labels\n", - "|**train** | A synonym for _fit_\n", - "|**pretrained model** | A model that has already been trained, generally using a large dataset, and will be fine-tuned\n", - "|**fine tune** | Update a pretrained model for a different task\n", - "|**epoch** | One complete pass through the input data\n", - "|**loss** | A measure of how good the model is, chosen to drive training via SGD\n", - "|**metric** | A measurement of how good the model is, using the validation set, chosen for human consumption\n", - "|**validation set** | A set of data held out from training, used only for measuring how good the model is\n", - "|**training set** | The data used for fitting the model; does not include any data from the validation set\n", - "|**overfitting** | Training a model in such a way that it _remembers_ specific features of the input data, rather than generalizing well to data not seen during training\n", - "|**CNN** | Convolutional neural network; a type of neural network that works particularly well for computer vision tasks\n", + "|Label | The data that we're trying to predict, such as \"dog\" or \"cat\"\n", + "|Architecture | The 
_template_ of the model that we're trying to fit; the actual mathematical function that we're passing the input data and parameters to\n", + "|Model | The combination of the architecture with a particular set of parameters\n", + "|Parameters | The values in the model that change what task it can do, and are updated through model training\n", + "|Fit | Update the parameters of the model such that the predictions of the model using the input data match the target labels\n", + "|Train | A synonym for _fit_\n", + "|Pretrained model | A model that has already been trained, generally using a large dataset, and will be fine-tuned\n", + "|Fine-tune | Update a pretrained model for a different task\n", + "|Epoch | One complete pass through the input data\n", + "|Loss | A measure of how good the model is, chosen to drive training via SGD\n", + "|Metric | A measurement of how good the model is, using the validation set, chosen for human consumption\n", + "|Validation set | A set of data held out from training, used only for measuring how good the model is\n", + "|Training set | The data used for fitting the model; does not include any data from the validation set\n", + "|Overfitting | Training a model in such a way that it _remembers_ specific features of the input data, rather than generalizing well to data not seen during training\n", + "|CNN | Convolutional neural network; a type of neural network that works particularly well for computer vision tasks\n", "|=====\n", "```" ] @@ -1833,33 +1849,33 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "With this vocabulary in hand, we are now in a position to bring together all the key concepts so far. Take a moment to review those definitions and read the following summary. If you can follow the explanation, then you have laid down the basic coordinates for understanding many discussions to come.\n", + "With this vocabulary in hand, we are now in a position to bring together all the key concepts introduced so far. 
Take a moment to review those definitions and read the following summary. If you can follow the explanation, then you're well equipped to understand the discussions to come.\n", "\n", - "*Machine learning* is a discipline where we define a program not by writing it entirely ourselves, but by learning from data. *Deep learning* is a specialty within machine learning which uses *neural networks* using multiple *layers*. *Image classification* is a representative example (also known as *image recognition*). We start with *labeled data*, that is, a set of images where we have assigned a *label* to each image indicating what it represents. Our goal is to produce a program, called a *model*, which, given a new image, will make an accurate *prediction* regarding what that new image represents.\n", + "*Machine learning* is a discipline where we define a program not by writing it entirely ourselves, but by learning from data. *Deep learning* is a specialty within machine learning that uses *neural networks* with multiple *layers*. *Image classification* is a representative example (also known as *image recognition*). We start with *labeled data*; that is, a set of images where we have assigned a *label* to each image indicating what it represents. Our goal is to produce a program, called a *model*, which, given a new image, will make an accurate *prediction* regarding what that new image represents.\n", "\n", - "Every model starts with a choice of *architecture*, a general template for how that kind of model works internally. The process of *training* (or *fitting*) the model is the process of finding a set of *parameter values* (or *weights*) which specializes that general architecture into a model that works well for our particular kind of data. 
In order to define how well a model does on a single prediction, we need to define a *loss function*, which defines how we score a prediction as good or bad, in order to support training.\n", + "Every model starts with a choice of *architecture*, a general template for how that kind of model works internally. The process of *training* (or *fitting*) the model is the process of finding a set of *parameter values* (or *weights*) that specialize that general architecture into a model that works well for our particular kind of data. In order to define how well a model does on a single prediction, we need to define a *loss function*, which determines how we score a prediction as good or bad.\n", "\n", - "In order to make the training process go faster, we might start with a *pretrained model*, a model which has already been trained on someone else's data. We then adapt it to our data by training it a bit more on our data, a process called *fine tuning*.\n", + "To make the training process go faster, we might start with a *pretrained model*--a model that has already been trained on someone else's data. We can then adapt it to our data by training it a bit more on our data, a process called *fine-tuning*.\n", "\n", - "When we train a model, a key concern is to ensure that our model *generalizes* -- that is, that it learns general lessons from our data which also apply to new items it will encounter, so that it can make good predictions on those items. The risk is that if we train our model badly, instead of learning general lessons it effectively memorizes what it has already seen, and then it will make poor predictions about new images. Such a failure is called *overfitting*. In order to avoid this, we always divide our data into two parts, the *training set* and the *validation set*. We train the model by showing it only the *training set* and then we evaluate how well the model is doing by seeing how well it predicts on items from the *validation set* . 
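A minimal sketch of such a split in plain Python--fastai provides this for you (for example via `RandomSplitter`), so this is only to make the idea concrete:

```python
import random

def train_valid_split(items, valid_pct=0.2, seed=42):
    """Randomly hold out valid_pct of the items as a validation set,
    keeping the rest as the training set."""
    rng = random.Random(seed)          # fixed seed so the split is reproducible
    shuffled = list(items)
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * valid_pct)
    return shuffled[cut:], shuffled[:cut]   # training set, validation set

train_items, valid_items = train_valid_split(range(10))
```

The key property is that the two sets are disjoint: every item ends up in exactly one of them, so validation metrics measure performance on data the model never saw during training.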
In this way, we check if the lessons the model learns from the training set are lessons that generalize to the validation set. In order for a person to assess how well the model is doing on the validation set overall, we define a *metric* . During the training process, when the model has seen every item in the training set, we call that an *epoch*.\n", + "When we train a model, a key concern is to ensure that our model *generalizes*--that is, that it learns general lessons from our data which also apply to new items it will encounter, so that it can make good predictions on those items. The risk is that if we train our model badly, instead of learning general lessons it effectively memorizes what it has already seen, and then it will make poor predictions about new images. Such a failure is called *overfitting*. In order to avoid this, we always divide our data into two parts, the *training set* and the *validation set*. We train the model by showing it only the training set and then we evaluate how well the model is doing by seeing how well it performs on items from the validation set. In this way, we check if the lessons the model learns from the training set are lessons that generalize to the validation set. In order for a person to assess how well the model is doing on the validation set overall, we define a *metric*. During the training process, when the model has seen every item in the training set, we call that an *epoch*.\n", "\n", - "All these concepts apply to machine learning in general. That is, they apply to all sorts of schemes for defining a model by training it with data. What makes deep learning distinctive is a particular class of architectures, the architectures based on *neural networks*. In particular, tasks like image classification rely heavily on *convolutional neural networks*, which we will discuss shortly." + "All these concepts apply to machine learning in general. 
That is, they apply to all sorts of schemes for defining a model by training it with data. What makes deep learning distinctive is a particular class of architectures: the architectures based on *neural networks*. In particular, tasks like image classification rely heavily on *convolutional neural networks*, which we will discuss shortly." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Deep learning is not just for image classification" + "## Deep Learning Is Not Just for Image Classification" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Deep learning's effectiveness for classifying images has been widely discussed in recent years, even showing _super-human_ results on complex tasks like recognizing malignant tumours in CT scans. But it can do a lot more than this, as we will show here.\n", + "Deep learning's effectiveness for classifying images has been widely discussed in recent years, even showing _superhuman_ results on complex tasks like recognizing malignant tumors in CT scans. But it can do a lot more than this, as we will show here.\n", "\n", - "For instance, let's talk about something that is critically important for autonomous vehicles: localising objects in a picture. If a self-driving car doesn't know where a pedestrian is, then it doesn't know how to avoid one! Creating a model which can recognize the content of every individual pixel in an image is called *segmentation*. Here is how we can train a segmentation model using fastai, using a subset of the *Camvid* dataset from the paper [Semantic Object Classes in Video: A High-Definition Ground Truth Database](http://mi.eng.cam.ac.uk/research/projects/VideoRec/CamVid/):" + "For instance, let's talk about something that is critically important for autonomous vehicles: localizing objects in a picture. If a self-driving car doesn't know where a pedestrian is, then it doesn't know how to avoid one! 
Creating a model that can recognize the content of every individual pixel in an image is called *segmentation*. Here is how we can train a segmentation model with fastai, using a subset of the [*Camvid* dataset](http://www0.cs.ucl.ac.uk/staff/G.Brostow/papers/Brostow_2009-PRL.pdf) from the paper \"Semantic Object Classes in Video: A High-Definition Ground Truth Database\" by Gabriel J. Brostow, Julien Fauqueur, and Roberto Cipolla:" ] }, { @@ -1984,9 +2000,9 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We are not even going to walk through this code line by line, because it is nearly identical to our previous example! (Although we will, of course, be doing a deep dive into segmentation models in <>, along with all of the other models that we are briefly introducing in this chapter, and many, many more.)\n", + "We are not even going to walk through this code line by line, because it is nearly identical to our previous example! (Although we will be doing a deep dive into segmentation models in <>, along with all of the other models that we are briefly introducing in this chapter, and many, many more.)\n", "\n", - "We can visualise how well it achieved its task, by asking the model to color code each pixel of an image. As you can see, it nearly perfectly classifies every pixel in every object; for instance, notice that all of the cars are overlaid with the same colour, and all of the trees are overlaid with the same color (in each pair of images, the left hand image is the ground truth labels, the right hand is the predictions from the model):" + "We can visualize how well it achieved its task, by asking the model to color-code each pixel of an image. As you can see, it nearly perfectly classifies every pixel in every object. 
For instance, notice that all of the cars are overlaid with the same color and all of the trees are overlaid with the same color (in each pair of images, the lefthand image is the ground truth label and the right is the prediction from the model):" ] }, { @@ -2025,7 +2041,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "One other area where deep learning has dramatically improved in the last couple of years is natural language processing (NLP). Computers can now generate text, translate automatically from one language to another, analyze comments, label words in sentences, and much more. Here is all of the code necessary to train a model which can classify the sentiment of a movie review better than anything that existed in the world just five years ago:" + "One other area where deep learning has dramatically improved in the last couple of years is natural language processing (NLP). Computers can now generate text, translate automatically from one language to another, analyze comments, label words in sentences, and much more. Here is all of the code necessary to train a model that can classify the sentiment of a movie review better than anything that existed in the world just five years ago:" ] }, { @@ -2157,7 +2173,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "This model is using the IMDb dataset from the paper [Learning Word Vectors for Sentiment Analysis](https://ai.stanford.edu/~amaas/data/sentiment/). It works well with movie reviews of many thousands of words. But let's test it out on a very short one, to see it does its thing:" + "This model is using the [\"IMDb Large Movie Review dataset\"](https://ai.stanford.edu/~ang/papers/acl11-WordVectorsSentimentAnalysis.pdf) from the paper \"Learning Word Vectors for Sentiment Analysis\" by Andrew Maas et al. 
It works well with movie reviews of many thousands of words, but let's test it out on a very short one to see how it does its thing:" ] }, { @@ -2196,21 +2212,21 @@ "source": [ "Here we can see the model has considered the review to be positive. The second part of the result is the index of \"pos\" in our data vocabulary and the last part is the probabilities attributed to each class (99.6% for \"pos\" and 0.4% for \"neg\"). \n", "\n", - "Now it's your turn! Write your own mini movie review, or copy one from the Internet, and we can see what this model thinks about it. " + "Now it's your turn! Write your own mini movie review, or copy one from the internet, and you can see what this model thinks about it. " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Sidebar: The order matters" + "### Sidebar: The Order Matters" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "In a Jupyter notebook, the order in which you execute each cell is very important. It's not like Excel, where everything gets updated as soon as you type something anywhere, but it has an inner state that gets updated each time you execute a cell. For instance, when you run the first cell of the notebook (with the CLICK ME comment), you create an object `learn` that contains a model and data for an image classification problem. If we were to run the cell right above (the one that predicts if a review is good or not) straight after, we would get an error as this `learn` object does not contain a text classification model. This cell needs to be run after the one containing \n", + "In a Jupyter notebook, the order in which you execute each cell is very important. It's not like Excel, where everything gets updated as soon as you type something anywhere--it has an inner state that gets updated each time you execute a cell. 
For instance, when you run the first cell of the notebook (with the \"CLICK ME\" comment), you create an object called `learn` that contains a model and data for an image classification problem. If we were to run the cell just shown in the text (the one that predicts if a review is good or not) straight after, we would get an error as this `learn` object does not contain a text classification model. This cell needs to be run after the one containing:\n",
    "\n",
    "```python\n",
    "from fastai2.text.all import *\n",
@@ -2221,11 +2237,11 @@
    "learn.fine_tune(4, 1e-2)\n",
    "```\n",
    "\n",
-    "The outputs themselves can be deceiving: they have the results of the last time the cell was executed, but if you change the code inside a cell without executing it, the old (misleading) results will remain.\n",
+    "The outputs themselves can be deceiving, because they include the results of the last time the cell was executed; if you change the code inside a cell without executing it, the old (misleading) results will remain.\n",
    "\n",
-    "Except when we mention it explicitly, the notebooks provided on the book website are meant to be run in order, from top to bottom. In general, when experimenting, you will find yourself executing cells in any order to go fast (which is a super neat feature of Jupyter Notebooks) but once you have explored and arrive at the final version of your code, make sure you can run the cells of your notebooks in order (your future self won't necessarily remember the convoluted path you took otherwise!). \n",
+    "Except when we mention it explicitly, the notebooks provided on the [book website](https://book.fast.ai/) are meant to be run in order, from top to bottom. 
In general, when experimenting, you will find yourself executing cells in any order to go fast (which is a super neat feature of Jupyter Notebook), but once you have explored and arrived at the final version of your code, make sure you can run the cells of your notebooks in order (your future self won't necessarily remember the convoluted path you took otherwise!). \n", "\n", - "In command mode, pressing `0` twice will restart the *kernel* (which is the engine powering your notebook). This will wipe your state clean and make it as if you had just started in the notebook. Click on the \"Cell\" menu and then on \"Run All Above\" to run all cells above the point where you are. We have found this to be very useful when developing the fastai library." + "In command mode, pressing `0` twice will restart the *kernel* (which is the engine powering your notebook). This will wipe your state clean and make it as if you had just started in the notebook. Choose Run All Above from the Cell menu to run all cells above the point where you are. We have found this to be very useful when developing the fastai library." ] }, { @@ -2239,7 +2255,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "If you ever have any questions about a fastai method, you should use the function `doc`:\n", + "If you ever have any questions about a fastai method, you should use the function `doc`, passing it the method name:\n", "\n", "```python\n", "doc(learn.predict)\n", @@ -2254,16 +2270,23 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "A brief one-line explanation is provided by `doc`. The *show in docs* link is where you'll find all the details in the [full documentation](https://docs.fast.ai/), and lots of examples. Also, most of fastai's methods are just a handful of lines, so you can click the *source* link to see exactly what's going on behind the scenes.\n", + "A brief one-line explanation is provided by `doc`. 
The \"Show in docs\" link takes you to the full documentation, where you'll find all the details and lots of examples. Also, most of fastai's methods are just a handful of lines, so you can click the \"source\" link to see exactly what's going on behind the scenes.\n",
    "\n",
-    "Let's move on to something much less sexy, but perhaps significantly more widely commercially useful: building models from plain *tabular* data. It turns out that looks very similar too. Here is the code necessary to train a model which will predict whether a person is a high-income earner, based on their socio-economic background:"
+    "Let's move on to something much less sexy, but perhaps significantly more widely commercially useful: building models from plain *tabular* data."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "> jargon: Tabular: Data that is in the form of a table, such as from a spreadsheet, database, or CSV file. A tabular model is a model that tries to predict one column of a table based on information in other columns of the table."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-    "> jargon: Tabular: Data that is in the form of a table, such as from a spreadsheet, database, or CSV file. A tabular model is a model which tries to predict one column of a table based on information in other columns of a table."
+    "It turns out that looks very similar too. 
Here is the code necessary to train a model that will predict whether a person is a high-income earner, based on their socioeconomic background:" ] }, { @@ -2288,9 +2311,9 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "As you see, we had to tell fastai which columns are *categorical* (that is, they contain values that are one of a discrete set of choices, such as `occupation`), versus *continuous* (that is, they contain a number that represents a quantity, such as `age`).\n", + "As you see, we had to tell fastai which columns are *categorical* (that is, contain values that are one of a discrete set of choices, such as `occupation`) and which are *continuous* (that is, contain a number that represents a quantity, such as `age`).\n", "\n", - "There is no pretrained model available for this task (in general, pretrained models are not widely available for any tabular modeling tasks, although some organizations have created them for internal use), so we don't use `fine_tune` in this case, but instead `fit_one_cycle`, the most commonly used method for training fastai models *from scratch* (i.e. without transfer learning):" + "There is no pretrained model available for this task (in general, pretrained models are not widely available for any tabular modeling tasks, although some organizations have created them for internal use), so we don't use `fine_tune` in this case. Instead we use `fit_one_cycle`, the most commonly used method for training fastai models *from scratch* (i.e. without transfer learning):" ] }, { @@ -2352,9 +2375,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "This model is using the *adult* dataset, from the paper [Scaling Up the Accuracy of Naive-Bayes Classifiers: a Decision-Tree Hybrid](https://archive.ics.uci.edu/ml/datasets/adult), which contains some data regarding individuals (like their education, marital status, race, sex, etc.) and whether or not they have an annual income greater than \\$50k. 
The model is over 80\\% accurate, and took around 30 seconds to train.\n", - "\n", - "Let's look at one more. Recommendation systems are very important, particularly in e-commerce. Companies like Amazon and Netflix try hard to recommend products or movies which you might like. Here's how to train a model which will predict which people might like which movie, based on their previous viewing habits, using the [MovieLens dataset](https://doi.org/10.1145/2827872):" + "This model is using the [*Adult* dataset](http://robotics.stanford.edu/~ronnyk/nbtree.pdf), from the paper \"Scaling Up the Accuracy of Naive-Bayes Classifiers: a Decision-Tree Hybrid\" by Rob Kohavi, which contains some demographic data about individuals (like their education, marital status, race, sex, and whether or not they have an annual income greater than \\$50k). The model is over 80\\% accurate, and took around 30 seconds to train." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's look at one more. Recommendation systems are very important, particularly in e-commerce. Companies like Amazon and Netflix try hard to recommend products or movies that users might like. Here's how to train a model that will predict movies people might like, based on their previous viewing habits, using the [MovieLens dataset](https://doi.org/10.1145/2827872):" ] }, { @@ -2489,7 +2517,7 @@ "source": [ "This model is predicting movie ratings on a scale of 0.5 to 5.0 to within around 0.6 average error. Since we're predicting a continuous number, rather than a category, we have to tell fastai what range our target has, using the `y_range` parameter.\n", "\n", - "Although we're not actually using a pretrained model (for the same reason that we didn't for the tabular model), this example shows that fastai lets us use `fine_tune` even in this case (we'll learn how and why this works later in <>). 
Sometimes it's best to experiment with `fine_tune` versus `fit_one_cycle` to see which works best for your dataset.\n", + "Although we're not actually using a pretrained model (for the same reason that we didn't for the tabular model), this example shows that fastai lets us use `fine_tune` anyway in this case (you'll learn how and why this works in <>). Sometimes it's best to experiment with `fine_tune` versus `fit_one_cycle` to see which works best for your dataset.\n", "\n", "We can use the same `show_results` call we saw earlier to view a few examples of user and movie IDs, actual ratings, and predictions:" ] @@ -2612,22 +2640,22 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Sidebar: Datasets: food for models" + "### Sidebar: Datasets: Food for Models" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "You’ve already seen in this section quite a few models, each one trained using a different dataset, to do a different task. In machine learning and deep learning, we can’t do anything without data. So, the people that create datasets for us to train our models are the (often under-appreciated) heroes. Some of the most useful and important datasets are those that become important *academic baselines*; that is, datasets that are widely studied by researchers and used to compare algorithmic changes. Some of these become household names (at least, among households that train models!), such as MNIST, CIFAR 10, and ImageNet.\n", + "You’ve already seen quite a few models in this section, each one trained using a different dataset to do a different task. In machine learning and deep learning, we can’t do anything without data. So, the people that create datasets for us to train our models on are the (often underappreciated) heroes. Some of the most useful and important datasets are those that become important *academic baselines*; that is, datasets that are widely studied by researchers and used to compare algorithmic changes. 
Some of these become household names (at least, among households that train models!), such as MNIST, CIFAR-10, and ImageNet.\n", "\n", - "The datasets used in this book have been selected because they provide great examples of the kind of data that you are likely to encounter, and the academic literature has many examples of model results using these datasets which you can compare your work to.\n", + "The datasets used in this book have been selected because they provide great examples of the kinds of data that you are likely to encounter, and the academic literature has many examples of model results using these datasets to which you can compare your work.\n", "\n", - "Most datasets used in this book took the creators a lot of work to build. For instance, later in the book we’ll be showing you how to create a model that can translate between French and English. The key input to this is a French/English parallel text corpus prepared back in 2009 by Professor Chris Callison-Burch of the University of Pennsylvania. This dataset contains over 20 million sentence pairs in French and English. He built the dataset in a really clever way: by crawling millions of Canadian web pages (which are often multi-lingual) and then using a set of simple heuristics to transform French URLs onto English URLs.\n", + "Most datasets used in this book took the creators a lot of work to build. For instance, later in the book we’ll be showing you how to create a model that can translate between French and English. The key input to this is a French/English parallel text corpus prepared back in 2009 by Professor Chris Callison-Burch of the University of Pennsylvania. This dataset contains over 20 million sentence pairs in French and English. 
He built the dataset in a really clever way: by crawling millions of Canadian web pages (which are often multilingual) and then using a set of simple heuristics to transform URLs of French content onto URLs pointing to the same content in English.\n", "\n", - "As you look at datasets throughout this book, think about where they might have come from, and how they might have been curated. Then, think about what kinds of interesting datasets you could create for your own projects. (We’ll even take you step by step through the process of creating your own image dataset soon.)\n", + "As you look at datasets throughout this book, think about where they might have come from, and how they might have been curated. Then think about what kinds of interesting datasets you could create for your own projects. (We’ll even take you step by step through the process of creating your own image dataset soon.)\n", "\n", - "fast.ai has spent a lot of time creating cutdown versions of popular datasets that are specially designed to support rapid prototyping and experimentation, and to be easier to learn with. In this book we will often start by using one of the cutdown versions, and we later on scale up to the full-size version (just as we're doing in this chapter!) In fact, this is how the world’s top practitioners do their modelling projects in practice; they do most of their experimentation and prototyping with subsets of their data, and only use the full dataset when they have a good understanding of what they have to do." + "fast.ai has spent a lot of time creating cut-down versions of popular datasets that are specially designed to support rapid prototyping and experimentation, and to be easier to learn with. In this book we will often start by using one of the cut-down versions and later scale up to the full-size version (just as we're doing in this chapter!). 
In fact, this is how the world’s top practitioners do their modeling in practice; they do most of their experimentation and prototyping with subsets of their data, and only use the full dataset when they have a good understanding of what they have to do." ] }, { @@ -2641,14 +2669,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Each of the models we trained showed a training and validation loss. A good validation set is one of the most important pieces of your training. Let's see why and learn how to create one." + "Each of the models we trained showed a training and validation loss. A good validation set is one of the most important pieces of the training process. Let's see why and learn how to create one." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Validation sets and test sets" + "## Validation Sets and Test Sets" ] }, { @@ -2657,37 +2685,37 @@ "source": [ "As we've discussed, the goal of a model is to make predictions about data. But the model training process is fundamentally dumb. If we trained a model with all our data, and then evaluated the model using that same data, we would not be able to tell how well our model can perform on data it hasn’t seen. Without this very valuable piece of information to guide us in training our model, there is a very good chance it would become good at making predictions about that data but would perform poorly on new data.\n", "\n", - "It is in order to avoid this that our first step was to split our dataset into two sets, the *training set* (which our model sees in training) and the *validation set*, also known as the *development set* (which is used only for evaluation). 
This lets us test that the model learns lessons from the training data which generalize to new data, the validation data.\n", + "To avoid this, our first step was to split our dataset into two sets: the *training set* (which our model sees in training) and the *validation set*, also known as the *development set* (which is used only for evaluation). This lets us test that the model learns lessons from the training data that generalize to new data, the validation data.\n", "\n", - "One way to understand this situation is that, in a sense, we don't want our model to get good results by \"cheating\". If it predicts well on a data item, that should be because it has learned principles that govern that kind of item, and not because the model has been shaped by *actually having seen that particular item*.\n", + "One way to understand this situation is that, in a sense, we don't want our model to get good results by \"cheating.\" If it makes an accurate prediction for a data item, that should be because it has learned characteristics of that kind of item, and not because the model has been shaped by *actually having seen that particular item*.\n", "\n", - "Splitting off our validation data means our model never sees it in training, and so is completely untainted by it, and is not cheating in any way. Right?\n", + "Splitting off our validation data means our model never sees it in training and so is completely untainted by it, and is not cheating in any way. Right?\n", "\n", - "In fact, not necessarily. The situation is more subtle. The subtlety is that in realistic scenarios we rarely build a model just by training its weight parameters once. Instead we are likely to explore many versions of a model through various modelling choices regarding network architecture, learning rates, data augmentation strategies, and other factors we will discuss in upcoming chapters. Many of these choices can be described as choices of *hyperparameters*. 
The word reflects that they are parameters about parameters, since they are the higher-level choices that govern the meaning of the weight parameters." + "In fact, not necessarily. The situation is more subtle. This is because in realistic scenarios we rarely build a model just by training its weight parameters once. Instead, we are likely to explore many versions of a model through various modeling choices regarding network architecture, learning rates, data augmentation strategies, and other factors we will discuss in upcoming chapters. Many of these choices can be described as choices of *hyperparameters*. The word reflects that they are parameters about parameters, since they are the higher-level choices that govern the meaning of the weight parameters." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "The problem is that, even though the ordinary training process is only looking at predictions on the training data when it learns values for the weight parameters, the same is not true about us. We, as modellers, are evaluating the model by looking at predictions on the validation data, when we decide to explore new hyperparameter values! So subsequent versions of the model are, indirectly, shaped by having seen the validation data. Just as the automatic training process is in danger of overfitting the training data, we are in danger of overfitting the validation data, by human trial and error and exploration.\n", + "The problem is that even though the ordinary training process is only looking at predictions on the training data when it learns values for the weight parameters, the same is not true of us. We, as modelers, are evaluating the model by looking at predictions on the validation data when we decide to explore new hyperparameter values! So subsequent versions of the model are, indirectly, shaped by us having seen the validation data. 
Just as the automatic training process is in danger of overfitting the training data, we are in danger of overfitting the validation data through human trial and error and exploration.\n", "\n", - "The solution to this conundrum is to introduce another level of even more highly reserved data, the \"test set\". Just as we hold back the validation data from the training process, we must hold back the test set data even from ourselves. It cannot be used to improve the model; it can only be used to evaluate the model at the very end of our efforts. In effect, we define a hierarchy of cuts of our data, based on how fully we want to hide it from training and modelling processes -- training data is fully exposed, the validation data is less exposed, and test data is totally hidden. This hierarchy parallels the different kinds of modelling and evaluation processes themselves -- the automatic training process with back propagation, the more manual process of trying different hyper-parameters between training sessions, and the assessment of our final result.\n", + "The solution to this conundrum is to introduce another level of even more highly reserved data, the *test set*. Just as we hold back the validation data from the training process, we must hold back the test set data even from ourselves. It cannot be used to improve the model; it can only be used to evaluate the model at the very end of our efforts. In effect, we define a hierarchy of cuts of our data, based on how fully we want to hide it from training and modeling processes: training data is fully exposed, the validation data is less exposed, and test data is totally hidden. 
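The hierarchy of reserved data described here can be sketched with a plain shuffled split (a minimal sketch in plain Python rather than fastai's own splitter utilities; the 60/20/20 proportions and the `train_valid_test_split` helper are just for illustration):

```python
import random

def train_valid_test_split(items, valid_frac=0.2, test_frac=0.2, seed=42):
    "Shuffle reproducibly, then carve off a validation set and a test set."
    items = list(items)
    random.Random(seed).shuffle(items)
    n_test = int(len(items) * test_frac)
    n_valid = int(len(items) * valid_frac)
    test = items[:n_test]                   # totally hidden until the very end
    valid = items[n_test:n_test + n_valid]  # used to compare hyperparameter choices
    train = items[n_test + n_valid:]        # fully exposed to the training process
    return train, valid, test

train, valid, test = train_valid_test_split(range(100))
```

In practice fastai normally creates the training/validation split for you (e.g. via `valid_pct`), so this is only to make the idea concrete.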
This hierarchy parallels the different kinds of modeling and evaluation processes themselves--the automatic training process with backpropagation, the more manual process of trying different hyperparameters between training sessions, and the assessment of our final result.\n",
    "\n",
-    "The test and validation sets should have enough data to ensure that you get a good estimate of your accuracy. If you're creating a cat detector, for instance, you generally want at least 30 cats in your validation set. That means that if you have a dataset with thousands of items, using the default 20% validation set size may be larger than you need. On the other hand, if you have lots of data, using some of it for the validation probably doesn't have any downsides.\n",
+    "The test and validation sets should have enough data to ensure that you get a good estimate of your accuracy. If you're creating a cat detector, for instance, you generally want at least 30 cats in your validation set. That means that if you have a dataset with thousands of items, using the default 20% validation set size may be more than you need. On the other hand, if you have lots of data, using some of it for validation probably doesn't have any downsides.\n",
    "\n",
-    "Having two levels of \"reserved data\", a validation set and a test set -- with one level representing data which you are virtually hiding from yourself -- may seem a bit extreme. But the reason it is often necessary is because models tend to gravitate toward the simplest way to do good predictions (memorization), and we as fallible humans tend to gravitate toward fooling ourselves about how well our models are performing. The discipline of the test set helps us keep ourselves intellectually honest. 
That doesn't mean we *always* need a separate test set--if you have very little data, you may need to just have a validation set--but generally it's best to use one if at all possible.\n", + "Having two levels of \"reserved data\"--a validation set and a test set, with one level representing data that you are virtually hiding from yourself--may seem a bit extreme. But the reason it is often necessary is because models tend to gravitate toward the simplest way to do good predictions (memorization), and we as fallible humans tend to gravitate toward fooling ourselves about how well our models are performing. The discipline of the test set helps us keep ourselves intellectually honest. That doesn't mean we *always* need a separate test set--if you have very little data, you may need to just have a validation set--but generally it's best to use one if at all possible.\n", "\n", - "This same discipline can be critical if you intend to hire a third-party to perform modelling work on your behalf. A third-party might not understand your requirements accurately, or their incentives might even encourage them to misunderstand them. But a good test set can greatly mitigate these risks and let you evaluate if their work solves your actual problem.\n", + "This same discipline can be critical if you intend to hire a third party to perform modeling work on your behalf. A third party might not understand your requirements accurately, or their incentives might even encourage them to misunderstand them. 
A good test set can greatly mitigate these risks and let you evaluate whether their work solves your actual problem.\n", "\n", - "To put it bluntly, if you're a senior decision maker in your organization (or you're advising senior decision makers) then the most important takeaway is this: if you ensure that you really understand what test and validation sets are, and why they're important, then you'll avoid the single biggest source of failures we've seen when organizations decide to use AI. For instance, if you're considering bringing in an external vendor or service, make sure that you hold out some test data that the vendor *never gets to see*. Then *you* check their model on your test data, using a metric that *you* choose based on what actually matters to you in practice, and *you* decide what level of performance is adequate. (It's also a good idea for you to try out some simple baseline yourself, so you know what a really simple model can achieve. Often it'll turn out that your simple model can be just as good as an external \"expert\"!)" + "To put it bluntly, if you're a senior decision maker in your organization (or you're advising senior decision makers), the most important takeaway is this: if you ensure that you really understand what test and validation sets are and why they're important, then you'll avoid the single biggest source of failures we've seen when organizations decide to use AI. For instance, if you're considering bringing in an external vendor or service, make sure that you hold out some test data that the vendor *never gets to see*. Then *you* check their model on your test data, using a metric that *you* choose based on what actually matters to you in practice, and *you* decide what level of performance is adequate. (It's also a good idea for you to try out some simple baseline yourself, so you know what a really simple model can achieve. 
Often it'll turn out that your simple model performs just as well as one produced by an external \"expert\"!)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Use judgment in defining test sets" + "### Use Judgment in Defining Test Sets" ] }, { @@ -2696,9 +2724,9 @@ "source": [ "To do a good job of defining a validation set (and possibly a test set), you will sometimes want to do more than just randomly grab a fraction of your original dataset. Remember: a key property of the validation and test sets is that they must be representative of the new data you will see in the future. This may sound like an impossible order! By definition, you haven’t seen this data yet. But you usually still do know some things.\n", "\n", - "It's instructive to look at a few example cases. Many of these examples come from predictive modeling competitions on the *Kaggle* platform, which is a good representation of problems and methods you would see in practice.\n", + "It's instructive to look at a few example cases. Many of these examples come from predictive modeling competitions on the [Kaggle](https://www.kaggle.com/) platform, which is a good representation of problems and methods you might see in practice.\n", "\n", - "One case might be if you are looking at time series data. For a time series, choosing a random subset of the data will be both too easy (you can look at the data both before and after the dates your are trying to predict) and not representative of most business use cases (where you are using historical data to build a model for use in the future). If your data includes the date and you are building a model to use in the future, you will want to choose a continuous section with the latest dates as your validation set (for instance, the last two weeks or last month of the available data).\n", + "One case might be if you are looking at time series data. 
For a time series, choosing a random subset of the data will be both too easy (you can look at the data both before and after the dates you are trying to predict) and not representative of most business use cases (where you are using historical data to build a model for use in the future). If your data includes the date and you are building a model to use in the future, you will want to choose a continuous section with the latest dates as your validation set (for instance, the last two weeks or last month of available data).\n",
    "\n",
    "Suppose you want to split the time series data in <> into training and validation sets."
   ]
  },
@@ -2728,7 +2756,7 @@
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-    "Use the earlier data as your training set (and the later data for the validation set), as shown in <>."
+    "Instead, use the earlier data as your training set (and the later data for the validation set), as shown in <>."
   ]
  },
@@ -2742,16 +2770,16 @@
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-    "For example, Kaggle had a competition to [predict the sales in a chain of Ecuadorian grocery stores](https://www.kaggle.com/c/favorita-grocery-sales-forecasting). Kaggle's *training data* ran from Jan 1 2013 to Aug 15 2017 and the test data spanned Aug 16 2017 to Aug 31 2017. That way, the competition organizer ensured that entrants were making predictions for a time period that was *in the future*, from the perspective of their model. This is similar to the way quant hedge fund traders do *back-testing* to check whether their models are predictive of future periods, based on past data."
+    "For example, Kaggle had a competition to [predict the sales in a chain of Ecuadorian grocery stores](https://www.kaggle.com/c/favorita-grocery-sales-forecasting). Kaggle's training data ran from Jan 1 2013 to Aug 15 2017, and the test data spanned Aug 16 2017 to Aug 31 2017. 
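A chronological split like this can be sketched in a few lines of plain Python (a minimal sketch; the daily `(date, value)` records are made up for illustration):

```python
from datetime import date, timedelta

# Hypothetical daily records: (date, value) pairs covering 100 consecutive days.
records = [(date(2017, 1, 1) + timedelta(days=i), float(i)) for i in range(100)]

# Sort chronologically, then hold out the most recent 20% as the validation set,
# rather than sampling dates at random.
records.sort(key=lambda r: r[0])
cutoff = int(len(records) * 0.8)
train_records, valid_records = records[:cutoff], records[cutoff:]

# Every validation date is strictly later than every training date.
assert max(r[0] for r in train_records) < min(r[0] for r in valid_records)
```
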
That way, the competition organizer ensured that entrants were making predictions for a time period that was *in the future*, from the perspective of their model. This is similar to the way quant hedge fund traders do *back-testing* to check whether their models are predictive of future periods, based on past data." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "After time series, a second common case is when you can easily anticipate ways the data you will be making predictions for in production may be *qualitatively different* from the data you have to train your model with.\n", + "A second common case is when you can easily anticipate ways the data you will be making predictions for in production may be *qualitatively different* from the data you have to train your model with.\n", "\n", - "In the Kaggle [distracted driver competition](https://www.kaggle.com/c/state-farm-distracted-driver-detection), the independent variables are pictures of drivers at the wheel of a car, and the dependent variable is a category such as texting, eating, or safely looking ahead. Lots of pictures were of the same drivers in different positions, as we can see in <>. If you were the insurance company building a model from this data, note that you would be most interested in how the model performs on drivers you haven't seen before (since you would likely have training data only for a small group of people). This is true of the Kaggle competition as well: the test data consists of people that weren't used in the training set." + "In the Kaggle [distracted driver competition](https://www.kaggle.com/c/state-farm-distracted-driver-detection), the independent variables are pictures of drivers at the wheel of a car, and the dependent variables are categories such as texting, eating, or safely looking ahead. Lots of pictures are of the same drivers in different positions, as we can see in <>. 
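Because the same driver appears in many images, a validation split should keep each driver entirely on one side. A minimal sketch of such a grouped split (with made-up filenames and driver IDs, not the competition's actual data):

```python
# Hypothetical dataset: each image filename is paired with the driver it shows.
images = [("img_%03d.jpg" % i, "driver_%d" % (i % 5)) for i in range(50)]

# Hold out whole drivers, so nobody appears in both sets and the model is
# always evaluated on people it never saw during training.
valid_drivers = {"driver_3", "driver_4"}
train_set = [img for img, drv in images if drv not in valid_drivers]
valid_set = [img for img, drv in images if drv in valid_drivers]
```
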
If you were an insurance company building a model from this data, note that you would be most interested in how the model performs on drivers it hasn't seen before (since you would likely have training data only for a small group of people). In recognition of this, the test data for the competition consists of images of people that don't appear in the training set." ] }, { @@ -2765,7 +2793,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "If you put one of the above images in your training set and one in the validation set, your model will seem to be performing better than it would on new people. Another perspective is that if you used all the people in training your model, your model may be overfitting to particularities of those specific people, and not just learning the states (texting, eating, etc).\n", + "If you put one of the images in <> in your training set and one in the validation set, your model will have an easy time making a prediction for the one in the validation set, so it will seem to be performing better than it would on new people. Another perspective is that if you used all the people in training your model, your model might be overfitting to particularities of those specific people, and not just learning the states (texting, eating, etc.).\n", "\n", "A similar dynamic was at work in the [Kaggle fisheries competition](https://www.kaggle.com/c/the-nature-conservancy-fisheries-monitoring) to identify the species of fish caught by fishing boats in order to reduce illegal fishing of endangered populations. The test set consisted of boats that didn't appear in the training data. This means that you'd want your validation set to include boats that are not in the training set.\n", "\n", @@ -2776,7 +2804,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Now that you have got a taste of how to build a model, you can decide what you want to dig into next." 
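The distracted-driver and fisheries examples both call for splitting by *group* (driver, boat) rather than by individual image. A minimal sketch in plain Python; the driver names, filenames, and label here are made up purely for illustration:

```python
import random

# Hypothetical labeled photos: several images per driver.
photos = [{"driver": d, "image": f"{d}_{i}.jpg", "label": "texting"}
          for d in ["ann", "bob", "cam", "dee", "eve"] for i in range(4)]

# Split on driver identity, not on individual photos, so that no
# person appears in both the training and validation sets.
drivers = sorted({p["driver"] for p in photos})
random.Random(42).shuffle(drivers)
valid_drivers = set(drivers[:2])          # hold out 2 of the 5 drivers

train = [p for p in photos if p["driver"] not in valid_drivers]
valid = [p for p in photos if p["driver"] in valid_drivers]

# No driver leaks across the split.
assert {p["driver"] for p in train}.isdisjoint(p["driver"] for p in valid)
```

With this split, good validation accuracy means the model generalizes to unseen people, which is exactly what the insurance company in the example would care about.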
+ "Now that you have gotten a taste of how to build a model, you can decide what you want to dig into next." ] }, { @@ -2790,9 +2818,9 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "If you would like to learn more about how to use deep learning models in practice, including identifying and fixing errors, and creating a real working web application, and how to avoid your model causing unexpected harm to your organization or society more generally, then keep reading the next chapters, _From model to production_, and _Data ethics_. If you would like to start learning the foundations of how deep learning works *under the hood*, skip to <>, _Under the hood: training a digit classifier_. (Did you ever read _Choose Your Own Adventure_ books as a kid? Well, this is kind of like that… except with more deep learning than that book series contained.)\n", + "If you would like to learn more about how to use deep learning models in practice, including how to identify and fix errors, create a real working web application, and avoid your model causing unexpected harm to your organization or society more generally, then keep reading the next two chapters. If you would like to start learning the foundations of how deep learning works under the hood, skip to <>. (Did you ever read _Choose Your Own Adventure_ books as a kid? Well, this is kind of like that… except with more deep learning than that book series contained.)\n", "\n", - "Either way, you will need to read all these chapters in order to progress further in the book; but it is totally up to you which order you read them in. They don't depend on each other. If you skip ahead to <>, then we will remind you at the end of that section to come back and read the chapters you skipped over before you go any further." + "You will need to read all these chapters to progress further in the book, but it is totally up to you which order you read them in. They don't depend on each other. 
If you skip ahead to <>, we will remind you at the end to come back and read the chapters you skipped over before you go any further." ] }, { @@ -2806,7 +2834,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "It can be hard to know in pages and pages of prose what are the key things you really need to focus on and remember. So we've prepared a list of questions and suggested steps to complete at the end of each chapter. All the answers are in the text of the chapter, so if you're not sure about anything here, re-read that part of the text and make sure you understand it. Answers to all these questions are also available on the [book website](https://book.fast.ai). You can also visit [the forums](https://forums.fast.ai) if you get stuck to get help from other folks studying this material." + "It can be hard to know in pages and pages of prose what the key things are that you really need to focus on and remember. So, we've prepared a list of questions and suggested steps to complete at the end of each chapter. All the answers are in the text of the chapter, so if you're not sure about anything here, reread that part of the text and make sure you understand it. Answers to all these questions are also available on the [book's website](https://book.fast.ai). You can also visit [the forums](https://forums.fast.ai) if you get stuck to get help from other folks studying this material." ] }, { @@ -2814,33 +2842,35 @@ "metadata": {}, "source": [ "1. Do you need these for deep learning?\n", + "\n", " - Lots of math T / F\n", " - Lots of data T / F\n", " - Lots of expensive computers T / F\n", " - A PhD T / F\n", + " \n", "1. Name five areas where deep learning is now the best in the world.\n", "1. What was the name of the first device that was based on the principle of the artificial neuron?\n", - "1. Based on the book of the same name, what are the requirements for \"Parallel Distributed Processing\"?\n", + "1. 
Based on the book of the same name, what are the requirements for parallel distributed processing (PDP)?\n", "1. What were the two theoretical misunderstandings that held back the field of neural networks?\n", "1. What is a GPU?\n", "1. Open a notebook and execute a cell containing: `1+1`. What happens?\n", "1. Follow through each cell of the stripped version of the notebook for this chapter. Before executing each cell, guess what will happen.\n", "1. Complete the Jupyter Notebook online appendix.\n", "1. Why is it hard to use a traditional computer program to recognize images in a photo?\n", - "1. What did Samuel mean by \"Weight Assignment\"?\n", - "1. What term do we normally use in deep learning for what Samuel called \"Weights\"?\n", - "1. Draw a picture that summarizes Arthur Samuel's view of a machine learning model\n", + "1. What did Samuel mean by \"weight assignment\"?\n", + "1. What term do we normally use in deep learning for what Samuel called \"weights\"?\n", + "1. Draw a picture that summarizes Samuel's view of a machine learning model.\n", "1. Why is it hard to understand why a deep learning model makes a particular prediction?\n", - "1. What is the name of the theorem that a neural network can solve any mathematical problem to any level of accuracy?\n", + "1. What is the name of the theorem that shows that a neural network can solve any mathematical problem to any level of accuracy?\n", "1. What do you need in order to train a model?\n", "1. How could a feedback loop impact the rollout of a predictive policing model?\n", - "1. Do we always have to use 224x224 pixel images with the cat recognition model?\n", + "1. Do we always have to use 224\\*224-pixel images with the cat recognition model?\n", "1. What is the difference between classification and regression?\n", "1. What is a validation set? What is a test set? Why do we need them?\n", "1. What will fastai do if you don't provide a validation set?\n", "1. 
Can we always use a random sample for a validation set? Why or why not?\n", "1. What is overfitting? Provide an example.\n", - "1. What is a metric? How does it differ to \"loss\"?\n", + "1. What is a metric? How does it differ from \"loss\"?\n", "1. How can pretrained models help?\n", "1. What is the \"head\" of a model?\n", "1. What kinds of features do the early layers of a CNN find? How about the later layers?\n", @@ -2856,14 +2886,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research" + "### Further Research" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Each chapter also has a \"further research\" with questions that aren't fully answered in the text, or include more advanced assignments. Answers to these questions aren't on the book website--you'll need to do your own research!" + "Each chapter also has a \"Further Research\" section that poses questions that aren't fully answered in the text, or gives more advanced assignments. Answers to these questions aren't on the book's website; you'll need to do your own research!" ] }, { @@ -2871,8 +2901,15 @@ "metadata": {}, "source": [ "1. Why is a GPU useful for deep learning? How is a CPU different, and why is it less effective for deep learning?\n", - "1. Try to think of three areas where feedback loops might impact use of machine learning. See if you can find documented examples of that happening in practice." + "1. Try to think of three areas where feedback loops might impact the use of machine learning. See if you can find documented examples of that happening in practice." 
] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] } ], "metadata": { diff --git a/02_production.ipynb b/02_production.ipynb index bde596f6c..e47a5d57c 100644 --- a/02_production.ipynb +++ b/02_production.ipynb @@ -22,14 +22,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# From model to production" + "# From Model to Production" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "The five lines of code we saw in <> are just one small part of the process of using deep learning in practice. In this chapter, we're going to use a computer vision example to look at the end-to-end process of creating a deep learning application. More specifically: we're going to build a bear classifier! In the process, we'll discuss the capabilities and constraints of deep learning, learn about how to create datasets, look at possible gotchas when using deep learning in practice, and more. Many of the key points will apply equally well to other deep learning problems, such as we showed in <>. If you work through a problem similar in key respects to our example problems, we expect you to get excellent results with little code, quickly.\n", + "The six lines of code we saw in <> are just one small part of the process of using deep learning in practice. In this chapter, we're going to use a computer vision example to look at the end-to-end process of creating a deep learning application. More specifically, we're going to build a bear classifier! In the process, we'll discuss the capabilities and constraints of deep learning, explore how to create datasets, look at possible gotchas when using deep learning in practice, and more. Many of the key points will apply equally well to other deep learning problems, such as those in <>. 
If you work through a problem similar in key respects to our example problems, we expect you to get excellent results with little code, quickly.\n", "\n", "Let's start with how you should frame your problem." ] @@ -38,51 +38,49 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## The practice of deep learning" + "## The Practice of Deep Learning" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "We've seen that deep learning can solve a lot of challenging problems quickly and with little code. As a beginner there's a sweet spot of problems that are similar enough to our example problems that you can very quickly get extremely useful results. However, deep learning isn't magic! The same 5 lines of code won't work on every problem anyone can think of today. Underestimating the constraints and overestimating the capabilities of deep learning may lead to frustratingly poor results. At least until you gain some experience to solve the problems that arise. Overestimating the constraints and underestimating the capabilities of deep learning may mean you do not attempt a solvable problem because you talk yourself out of it. \n", + "We've seen that deep learning can solve a lot of challenging problems quickly and with little code. As a beginner, there's a sweet spot of problems that are similar enough to our example problems that you can very quickly get extremely useful results. However, deep learning isn't magic! The same 6 lines of code won't work for every problem anyone can think of today. Underestimating the constraints and overestimating the capabilities of deep learning may lead to frustratingly poor results, at least until you gain some experience and can solve the problems that arise. Conversely, overestimating the constraints and underestimating the capabilities of deep learning may mean you do not attempt a solvable problem because you talk yourself out of it. 
\n", "\n", - "We often talk to people who underestimate both the constraints and the capabilities of deep learning. Both of these can be problems: underestimating the capabilities means that you might not even try things which could be very beneficial; underestimating the constraints might mean that you fail to consider and react to important issues.\n", + "We often talk to people who underestimate both the constraints and the capabilities of deep learning. Both of these can be problems: underestimating the capabilities means that you might not even try things that could be very beneficial, and underestimating the constraints might mean that you fail to consider and react to important issues.\n", "\n", - "The best thing to do is to keep an open mind. If you remain open to the possibility that deep learning might solve part of your problem with less data or complexity than you expect, then it is possible to design a process where you can find the specific capabilities and constraints related to your particular problem as you work through the process. This doesn't mean making any risky bets — we will show you how you can gradually roll out models so that they don't create significant risks, and can even backtest them prior to putting them in production.\n", - "\n", - "Let's start with how you should frame your problem." + "The best thing to do is to keep an open mind. If you remain open to the possibility that deep learning might solve part of your problem with less data or complexity than you expect, then it is possible to design a process where you can find the specific capabilities and constraints related to your particular problem as you work through the process. This doesn't mean making any risky bets — we will show you how you can gradually roll out models so that they don't create significant risks, and can even backtest them prior to putting them in production." 
] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Starting your project" + "### Starting Your Project" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "So where should you start your deep learning journey? The most important thing is to ensure that you have some project that you are working on — it is only through working on your own projects that you will get the real experience of building and using models. When selecting a project, the most important consideration is data availability. Regardless of whether you are doing a project just for your own learning or for practical application in your organization, you want something where you can get started quickly. We have seen many students, researchers, and industry practitioners waste months or years while they attempt to find their perfect dataset. The goal is not to find the perfect dataset or the perfect project, but just to get started and iterate from there.\n", + "So where should you start your deep learning journey? The most important thing is to ensure that you have some project to work on—it is only through working on your own projects that you will get real experience building and using models. When selecting a project, the most important consideration is data availability. Regardless of whether you are doing a project just for your own learning or for practical application in your organization, you want something where you can get started quickly. We have seen many students, researchers, and industry practitioners waste months or years while they attempt to find their perfect dataset. 
The goal is not to find the \"perfect\" dataset or project, but just to get started and iterate from there.\n", "\n", - "If you take this approach, then you will be on your third iteration of learning and improving whilst the perfectionists are still in the planning stages!\n", + "If you take this approach, then you will be on your third iteration of learning and improving while the perfectionists are still in the planning stages!\n", "\n", - "We also suggest that you iterate from end to end in your project; that is, don't spend months fine tuning your model, or polishing the perfect GUI, or labelling the perfect dataset… Instead, complete every step as well as you can in a reasonable amount of time, all the way to the end. For instance, if your final goal is an application that runs on a mobile phone, then that should be what you have after each iteration. But perhaps in the early iterations you take some shortcuts, for instance by doing all of the processing on a remote server, and using a simple responsive web application. By completing the project end to end, you will see where the most tricky bits are, and which bits make the biggest difference to the final result." + "We also suggest that you iterate from end to end in your project; that is, don't spend months fine-tuning your model, or polishing the perfect GUI, or labelling the perfect dataset… Instead, complete every step as well as you can in a reasonable amount of time, all the way to the end. For instance, if your final goal is an application that runs on a mobile phone, then that should be what you have after each iteration. But perhaps in the early iterations you take some shortcuts, for instance by doing all of the processing on a remote server, and using a simple responsive web application. By completing the project end to end, you will see where the trickiest bits are, and which bits make the biggest difference to the final result." 
] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "As you work through this book, we suggest that you both complete lots of small experiments, by running and adjusting the notebooks we provide, at the same time that you gradually develop your own projects. That way, you will be getting experience with all of the tools and techniques that we're explaining, as we discuss them.\n", + "As you work through this book, we suggest that you complete lots of small experiments, by running and adjusting the notebooks we provide, at the same time that you gradually develop your own projects. That way, you will be getting experience with all of the tools and techniques that we're explaining, as we discuss them.\n", "\n", - "> s: To make the most of this book, take the time to experiment between each chapter, be it on your own project or exploring the notebooks we provide. Then try re-writing those notebooks from scratch on a new dataset. It's only by practicing (and failing) a lot that you will get an intuition on how to train a model. \n", + "> s: To make the most of this book, take the time to experiment between each chapter, be it on your own project or by exploring the notebooks we provide. Then try rewriting those notebooks from scratch on a new dataset. It's only by practicing (and failing) a lot that you will get an intuition of how to train a model. \n", "\n", - "By using the end to end iteration approach you will also get a better understanding of how much data you really need. For instance, you may find you can only easily get 200 labelled data items, and you can't really know until you try whether that's enough to get the performance you need for your application to work well in practice.\n", + "By using the end-to-end iteration approach you will also get a better understanding of how much data you really need. 
For instance, you may find you can only easily get 200 labeled data items, and you can't really know until you try whether that's enough to get the performance you need for your application to work well in practice.\n", "\n", - "In an organizational context you will be able to show your colleagues that your idea can really work, by showing them a real working prototype. We have repeatedly observed that this is the secret to getting good organizational buy-in for a project." + "In an organizational context you will be able to show your colleagues that your idea can really work by showing them a real working prototype. We have repeatedly observed that this is the secret to getting good organizational buy-in for a project." ] }, { @@ -93,21 +91,21 @@ "\n", "Sometimes, you have to get a bit creative. Maybe you can find some previous machine learning project, such as a Kaggle competition, that is related to your field of interest. Sometimes, you have to compromise. Maybe you can't find the exact data you need for the precise project you have in mind; but you might be able to find something from a similar domain, or measured in a different way, tackling a slightly different problem. Working on these kinds of similar projects will still give you a good understanding of the overall process, and may help you identify other shortcuts, data sources, and so forth.\n", "\n", - "Especially when you are just starting out with deep learning, it's not a good idea to branch out into very different areas to places that deep learning has not been applied to before. That's because if your model does not work at first, you will not know whether it is because you have made a mistake, or if the very problem you are trying to solve is simply not solvable with deep learning. And you won't know where to look to get help. 
Therefore, it is best at first to start with something where you can find an example online of somebody who has had good results with something that is at least somewhat similar to what you are trying to achieve, or where you can convert your data into a format similar to what someone else has used before (such as creating an image from your data). Let's have a look at the state of deep learning, just so you know what kinds of things deep learning is good at right now." + "Especially when you are just starting out with deep learning, it's not a good idea to branch out into very different areas, to places that deep learning has not been applied to before. That's because if your model does not work at first, you will not know whether it is because you have made a mistake, or if the very problem you are trying to solve is simply not solvable with deep learning. And you won't know where to look to get help. Therefore, it is best at first to start with something where you can find an example online where somebody has had good results with something that is at least somewhat similar to what you are trying to achieve, or where you can convert your data into a format similar to what someone else has used before (such as creating an image from your data). Let's have a look at the state of deep learning, just so you know what kinds of things deep learning is good at right now." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### The state of deep learning" + "### The State of Deep Learning" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Let's start by considering whether deep learning can be any good at the problem you are looking to work on. In general, here is a summary of the state of deep learning at the start of 2020. However, things move very fast, and by the time you read this some of these constraints may no longer exist. 
We will try to keep the book website up-to-date; in addition, a Google search for \"what can AI do now\" is likely to provide some up-to-date information." + "Let's start by considering whether deep learning can be any good at the problem you are looking to work on. This section provides a summary of the state of deep learning at the start of 2020. However, things move very fast, and by the time you read this some of these constraints may no longer exist. We will try to keep the book's website up-to-date; in addition, a Google search for \"what can AI do now\" is likely to provide current information." ] }, { @@ -121,9 +119,9 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "There are many domains in which deep learning has not been used to analyse images yet, but those where it has been tried have nearly universally shown that computers can recognise what items are in an image at least as well as people can — even specially trained people, such as radiologists. This is known as *object recognition*. Deep learning is also good at recognizing whereabouts objects in an image are, and can highlight their location and name each found object. This is known as *object detection* (there is also a variant of this we saw in <>, where every pixel is categorized based on what kind of object it is part of--this is called *segmentation*). Deep learning algorithms are generally not good at recognizing images that are significantly different in structure or style to those used to train the model. For instance, if there were no black-and-white images in the training data, the model may do poorly on black-and-white images. If the training data did not contain hand-drawn images then the model will probably do poorly on hand-drawn images. 
There is no general way to check what types of images are missing in your training set, but we will show in this chapter some ways to try to recognize when unexpected image types arise in the data when the model is being used in production (this is known as checking for *out of domain* data).\n", + "There are many domains in which deep learning has not been used to analyze images yet, but those where it has been tried have nearly universally shown that computers can recognize what items are in an image at least as well as people can—even specially trained people, such as radiologists. This is known as *object recognition*. Deep learning is also good at recognizing where objects in an image are, and can highlight their locations and name each found object. This is known as *object detection* (there is also a variant of this that we saw in <>, where every pixel is categorized based on what kind of object it is part of--this is called *segmentation*). Deep learning algorithms are generally not good at recognizing images that are significantly different in structure or style to those used to train the model. For instance, if there were no black-and-white images in the training data, the model may do poorly on black-and-white images. Similarly, if the training data did not contain hand-drawn images, then the model will probably do poorly on hand-drawn images. There is no general way to check what types of images are missing in your training set, but we will show in this chapter some ways to try to recognize when unexpected image types arise in the data when the model is being used in production (this is known as checking for *out-of-domain* data).\n", "\n", - "One major challenge for object detection systems is that image labelling can be slow and expensive. There is a lot of work at the moment going into tools to try to make this labelling faster and easier, and require fewer handcrafted labels to train accurate object detection models. 
One approach which is particularly helpful is to synthetically generate variations of input images, such as by rotating them or changing their brightness and contrast; this is called *data augmentation* and also works well for text and other types of model. We will be discussing it in detail in this chapter.\n", + "One major challenge for object detection systems is that image labelling can be slow and expensive. There is a lot of work at the moment going into tools to try to make this labelling faster and easier, and to require fewer handcrafted labels to train accurate object detection models. One approach that is particularly helpful is to synthetically generate variations of input images, such as by rotating them or changing their brightness and contrast; this is called *data augmentation* and also works well for text and other types of models. We will be discussing it in detail in this chapter.\n", "\n", "Another point to consider is that although your problem might not look like a computer vision problem, it might be possible with a little imagination to turn it into one. For instance, if what you are trying to classify are sounds, you might try converting the sounds into images of their acoustic waveforms and then training a model on those images." ] @@ -139,11 +137,11 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Just like in computer vision, computers are very good at categorising both short and long documents based on categories such as spam, sentiment (e.g. is the review positive or negative), author, source website, and so forth. We are not aware of any rigorous work done in this area to compare to human performance, but anecdotally it seems to us that deep learning performance is similar to human performance here. Deep learning is also very good at generating context-appropriate text, such as replies to social media posts, and imitating a particular author's style. 
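The data augmentation idea mentioned earlier (synthetically varying input images, e.g. by flipping them or adjusting brightness) can be sketched with nested Python lists standing in for image arrays; a real pipeline would use a vision library's transforms rather than these toy functions:

```python
# A tiny 2x3 grayscale "image" as nested lists of 0-255 pixel values.
img = [[10, 20, 30],
       [40, 50, 60]]

def flip_horizontal(image):
    """Mirror each row left-to-right."""
    return [row[::-1] for row in image]

def adjust_brightness(image, delta):
    """Shift every pixel by delta, clamped to the 0-255 range."""
    return [[max(0, min(255, px + delta)) for px in row] for row in image]

# Each original image yields several augmented training variants,
# giving the model more examples without new hand-labeled data.
augmented = [img, flip_horizontal(img), adjust_brightness(img, 15)]
```

Because the label (say, "bear") is unchanged by a flip or a brightness shift, each variant is a free extra labeled example.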
It is also good at making this content compelling to humans, and has been shown to be even more compelling than human-generated text. However, deep learning is currently not good at generating *correct* responses! We don't currently have a reliable way to, for instance, combine a knowledge base of medical information, along with a deep learning model for generating medically correct natural language responses. This is very dangerous, because it is so easy to create content which appears to a layman to be compelling, but actually is entirely incorrect.\n", + "Computers are very good at classifying both short and long documents based on categories such as spam or not spam, sentiment (e.g., is the review positive or negative), author, source website, and so forth. We are not aware of any rigorous work done in this area to compare them to humans, but anecdotally it seems to us that deep learning performance is similar to human performance on these tasks. Deep learning is also very good at generating context-appropriate text, such as replies to social media posts, and imitating a particular author's style. It's good at making this content compelling to humans too--in fact, even more compelling than human-generated text. However, deep learning is currently not good at generating *correct* responses! We don't currently have a reliable way to, for instance, combine a knowledge base of medical information with a deep learning model for generating medically correct natural language responses. This is very dangerous, because it is so easy to create content that appears to a layman to be compelling, but actually is entirely incorrect.\n", "\n", - "Another concern is that context-appropriate, highly compelling responses on social media can be used at massive scale — thousands of times greater than any troll farm previously seen — to spread disinformation, create unrest, and encourage conflict. 
As a rule of thumb, text generation will always be technologically a bit ahead of the ability of models to recognize automatically generated text. For instance, it is possible to use a model that can recognize artificially generated content to actually improve the generator that creates that content, until the classification model is no longer able to complete its task.\n", + "Another concern is that context-appropriate, highly compelling responses on social media could be used at massive scale—thousands of times greater than any troll farm previously seen—to spread disinformation, create unrest, and encourage conflict. As a rule of thumb, text generation models will always be technologically a bit ahead of models recognizing automatically generated text. For instance, it is possible to use a model that can recognize artificially generated content to actually improve the generator that creates that content, until the classification model is no longer able to complete its task.\n", "\n", - "Despite these issues, deep learning can be used to translate text from one language to another, summarize long documents into something which can be digested more quickly, find all mentions of a concept of interest, and many more. Unfortunately, the translation or summary could well include completely incorrect information! However, it is already good enough that many people are using the systems — for instance Google's online translation system (and every other online service we are aware of) is based on deep learning." + "Despite these issues, deep learning has many applications in NLP: it can be used to translate text from one language to another, summarize long documents into something that can be digested more quickly, find all mentions of a concept of interest, and more. Unfortunately, the translation or summary could well include completely incorrect information! 
However, the performance is already good enough that many people are using these systems—for instance, Google's online translation system (and every other online service we are aware of) is based on deep learning." ] }, { @@ -159,7 +157,7 @@ "source": [ "The ability of deep learning to combine text and images into a single model is, generally, far better than most people intuitively expect. For example, a deep learning model can be trained on input images with output captions written in English, and can learn to generate surprisingly appropriate captions automatically for new images! But again, we have the same warning that we discussed in the previous section: there is no guarantee that these captions will actually be correct.\n", "\n", - "Because of this serious issue, we generally recommend that deep learning be used not as an entirely automated process, but as part of a process in which the model and a human user interact closely. This can potentially make humans orders of magnitude more productive than they would be with entirely manual methods, and actually result in more accurate processes than using a human alone. For instance, an automatic system can be used to identify potential strokes directly from CT scans, and send a high priority alert to have those scans looked at quickly. There is only a three-hour window to treat strokes, so this fast feedback loop could save lives. At the same time, however, all scans could continue to be sent to radiologists in the usual way, so there would be no reduction in human input. Other deep learning models could automatically measure items seen on the scan, and insert those measurements into reports, warning the radiologist about findings that they may have missed, and tell the radiologist about other cases which might be relevant." 
+ "Because of this serious issue, we generally recommend that deep learning be used not as an entirely automated process, but as part of a process in which the model and a human user interact closely. This can potentially make humans orders of magnitude more productive than they would be with entirely manual methods, and actually result in more accurate processes than using a human alone. For instance, an automatic system can be used to identify potential stroke victims directly from CT scans, and send a high-priority alert to have those scans looked at quickly. There is only a three-hour window to treat strokes, so this fast feedback loop could save lives. At the same time, however, all scans could continue to be sent to radiologists in the usual way, so there would be no reduction in human input. Other deep learning models could automatically measure items seen on the scans, and insert those measurements into reports, warning the radiologists about findings that they may have missed, and telling them about other cases that might be relevant." ] }, { @@ -173,7 +171,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "For analysing timeseries and tabular data, deep learning has recently been making great strides. However, deep learning is generally used as part of an ensemble of multiple types of model. If you already have a system that is using random forests or gradient boosting machines (popular tabular modelling tools that we will learn about soon) then switching to, or adding, deep learning may not result in any dramatic improvement. Deep learning does greatly increase the variety of columns that you can include, for example columns containing natural language (e.g. book titles, reviews, etc.), and *high cardinality categorical* columns (i.e. something that contains a large number of discrete choices, such as zip code or product id). 
On the downside, deep learning models generally take longer to train than random forests or gradient boosting machines, although this is changing thanks to libraries such as [RAPIDS](https://rapids.ai/), which provides GPU acceleration for the whole modeling pipeline. We cover the pros and cons of all these methods in detail in <> in this book." + "For analyzing time series and tabular data, deep learning has recently been making great strides. However, deep learning is generally used as part of an ensemble of multiple types of model. If you already have a system that is using random forests or gradient boosting machines (popular tabular modeling tools that you will learn about soon), then switching to or adding deep learning may not result in any dramatic improvement. Deep learning does greatly increase the variety of columns that you can include--for example, columns containing natural language (book titles, reviews, etc.), and high-cardinality categorical columns (i.e., something that contains a large number of discrete choices, such as zip code or product ID). On the down side, deep learning models generally take longer to train than random forests or gradient boosting machines, although this is changing thanks to libraries such as [RAPIDS](https://rapids.ai/), which provides GPU acceleration for the whole modeling pipeline. We cover the pros and cons of all these methods in detail in <>." ] }, { @@ -187,9 +185,9 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Recommendation systems are really just a special type of tabular data. In particular, they generally have a high cardinality categorical variable representing users, and another one representing products (or something similar). A company like Amazon represents every purchase that has ever been made as a giant sparse matrix, with customers as the rows and products as the columns. 
Once they have the data in this format, data scientists apply some form of collaborative filtering to *fill in the matrix*. For example, if customer A buys products 1 and 10, and customer B buys products 1, 2, 4, and 10, the engine will recommend that A buy 2 and 4. Because deep learning models are good at handling high cardinality categorical variables, they are quite good at handling recommendation systems. They particularly come into their own, just like for tabular data, when combining these variables with other kinds of data, such as natural language or images. They can also do a good job of combining all of these types of information with additional meta data represented as tables, such as user information, previous transactions, and so forth.\n", + "Recommendation systems are really just a special type of tabular data. In particular, they generally have a high-cardinality categorical variable representing users, and another one representing products (or something similar). A company like Amazon represents every purchase that has ever been made by its customers as a giant sparse matrix, with customers as the rows and products as the columns. Once they have the data in this format, data scientists apply some form of collaborative filtering to *fill in the matrix*. For example, if customer A buys products 1 and 10, and customer B buys products 1, 2, 4, and 10, the engine will recommend that A buy 2 and 4. Because deep learning models are good at handling high-cardinality categorical variables, they are quite good at handling recommendation systems. They particularly come into their own, just like for tabular data, when combining these variables with other kinds of data, such as natural language or images. 
They can also do a good job of combining all of these types of information with additional metadata represented as tables, such as user information, previous transactions, and so forth.\n", "\n", - "However, nearly all machine learning approaches have the downside that they only tell you what products a particular user might like, rather than what recommendations would be helpful for a user. Many kinds of recommendations for products a user might like may not be at all helpful, for instance, if the user is already familiar with the products, or if they are simply different packagings of products they have already purchased (such as a boxed set of novels, where they already have each of the items in that set). Jeremy likes reading books by Terry Pratchett, and for a while Amazon was recommending nothing but Terry Pratchett books to him (see <>), which really wasn't helpful because he already was aware of these books!" + "However, nearly all machine learning approaches have the downside that they only tell you what products a particular user might like, rather than what recommendations would be helpful for a user. Many kinds of recommendations for products a user might like may not be at all helpful--for instance, if the user is already familiar with the products, or if they are simply different packagings of products they have already purchased (such as a boxed set of novels, when they already have each of the items in that set). Jeremy likes reading books by Terry Pratchett, and for a while Amazon was recommending nothing but Terry Pratchett books to him (see <>), which really wasn't helpful because he already was aware of these books!" ] }, { @@ -203,85 +201,97 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "**Other data types**: Often you will find that domain-specific data types fit very nicely into existing categories. 
For instance, protein chains look a lot like natural language documents, in that they are long sequences of discrete tokens with complex relationships and meaning throughout the sequence. And indeed, it does turn out that using NLP deep learning methods is the current state of the art approach for many types of protein analysis. As another example: sounds can be represented as spectrograms, which can be treated as images; standard deep learning approaches for images turn out to work really well on spectrograms." + "#### Other data types" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "There are many accurate models that are of no use to anyone, and many inaccurate models that are highly useful. To ensure that your modeling work is useful in practice, you need to consider how your work will be used. In 2012 Jeremy, along with Margit Zwemer and Mike Loukides, introduced a method called *The Drivetrain Approach* for thinking about this issue." + "Often you will find that domain-specific data types fit very nicely into existing categories. For instance, protein chains look a lot like natural language documents, in that they are long sequences of discrete tokens with complex relationships and meaning throughout the sequence. And indeed, it does turn out that using NLP deep learning methods is the current state-of-the-art approach for many types of protein analysis. As another example, sounds can be represented as spectrograms, which can be treated as images; standard deep learning approaches for images turn out to work really well on spectrograms." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### The Drivetrain approach" + "### The Drivetrain Approach" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "The Drivetrain approach, illustrated in <>, was described in detail in [Designing Great Data Products](https://www.oreilly.com/radar/drivetrain-approach-data-products/). 
The basic idea is to start with considering your objective, then think about what you can actually do to change that objective (\"levers\"), what data you have that might help you connect potential changes to levers to changes in your objective, and then to build a model of that. You can then use that model to find the best actions (that is, changes to levers) to get the best results in terms of your objective.\n", - "\n", - "Consider a model in an autonomous vehicle: you want to help a car drive safely from point A to point B without human intervention. Great predictive modeling is an important part of the solution, but it doesn't stand on its own; as products become more sophisticated, it disappears into the plumbing. Someone using a self-driving car is completely unaware of the hundreds (if not thousands) of models and the petabytes of data that make it work. But as data scientists build increasingly sophisticated products, they need a systematic design approach.\n", - "\n", - "We use data not just to generate more data (in the form of predictions), but to produce *actionable outcomes*. That is the goal of the Drivetrain Approach. Start by defining a clear **objective**. For instance, Google, when creating their first search engine, considered \"What is the user’s main objective in typing in a search query?\", and their answer was \"show the most relevant search result\". The next step is to consider what **levers** you can pull (i.e. what actions could you take) to better achieve that objective. In Google's case, that was the ranking of the search results. The third step was to consider what new **data** they would need to produce such a ranking; they realized that the implicit information regarding which pages linked to which other pages could be used for this purpose. Only after these first three steps do we begin thinking about building the predictive **models**. 
Our objective and available levers, what data we already have and what additional data we will need to collect, determine the models we can build. The models will take both the levers and any uncontrollable variables as their inputs; the outputs from the models can be combined to predict the final state for our objective." + "There are many accurate models that are of no use to anyone, and many inaccurate models that are highly useful. To ensure that your modeling work is useful in practice, you need to consider how your work will be used. In 2012 Jeremy, along with Margit Zwemer and Mike Loukides, introduced a method called *the Drivetrain Approach* for thinking about this issue." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The Drivetrain Approach, illustrated in <>, was described in detail in [\"Designing Great Data Products\"](https://www.oreilly.com/radar/drivetrain-approach-data-products/). The basic idea is to start with considering your objective, then think about what actions you can take to meet that objective and what data you have (or can acquire) that can help, and then build a model that you can use to determine the best actions to take to get the best results in terms of your objective." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "" + "" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Let's consider another example: recommendation systems. The **objective** of a recommendation engine is to drive additional sales by surprising and delighting the customer with recommendations of items they would not have purchased without the recommendation. The **lever** is the ranking of the recommendations. New **data** must be collected to generate recommendations that will *cause new sales*. This will require conducting many randomized experiments in order to collect data about a wide range of recommendations for a wide range of customers. 
This is a step that few organizations take; but without it, you don't have the information you need to actually optimize recommendations based on your true objective (more sales!)\n", + "Consider a model in an autonomous vehicle: you want to help a car drive safely from point A to point B without human intervention. Great predictive modeling is an important part of the solution, but it doesn't stand on its own; as products become more sophisticated, it disappears into the plumbing. Someone using a self-driving car is completely unaware of the hundreds (if not thousands) of models and the petabytes of data that make it work. But as data scientists build increasingly sophisticated products, they need a systematic design approach.\n", "\n", - "Finally, you could build two **models** for purchase probabilities, conditional on seeing or not seeing a recommendation. The difference between these two probabilities is a utility function for a given recommendation to a customer. It will be low in cases where the algorithm recommends a familiar book that the customer has already rejected (both components are small) or a book that he or she would have bought even without the recommendation (both components are large and cancel each other out).\n", + "We use data not just to generate more data (in the form of predictions), but to produce *actionable outcomes*. That is the goal of the Drivetrain Approach. Start by defining a clear *objective*. For instance, Google, when creating their first search engine, considered \"What is the user’s main objective in typing in a search query?\" This led them to their objective, which was to \"show the most relevant search result.\" The next step is to consider what *levers* you can pull (i.e., what actions you can take) to better achieve that objective. In Google's case, that was the ranking of the search results. 
The third step was to consider what new *data* they would need to produce such a ranking; they realized that the implicit information regarding which pages linked to which other pages could be used for this purpose. Only after these first three steps do we begin thinking about building the predictive *models*. Our objective and available levers, what data we already have and what additional data we will need to collect, determine the models we can build. The models will take both the levers and any uncontrollable variables as their inputs; the outputs from the models can be combined to predict the final state for our objective." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's consider another example: recommendation systems. The *objective* of a recommendation engine is to drive additional sales by surprising and delighting the customer with recommendations of items they would not have purchased without the recommendation. The *lever* is the ranking of the recommendations. New *data* must be collected to generate recommendations that will *cause new sales*. This will require conducting many randomized experiments in order to collect data about a wide range of recommendations for a wide range of customers. This is a step that few organizations take; but without it, you don't have the information you need to actually optimize recommendations based on your true objective (more sales!).\n", "\n", - "As you can see, in practice often the practical implementation of your model will require a lot more than just training a model! You'll often need to run experiments to collect more data, and consider how to incorporate your models into the overall system you're developing. Speaking of data, let's now focus on how to find data for your project." + "Finally, you could build two *models* for purchase probabilities, conditional on seeing or not seeing a recommendation. 
The difference between these two probabilities is a utility function for a given recommendation to a customer. It will be low in cases where the algorithm recommends a familiar book that the customer has already rejected (both components are small) or a book that they would have bought even without the recommendation (both components are large and cancel each other out).\n", + "\n", + "As you can see, in practice often the practical implementation of your models will require a lot more than just training a model! You'll often need to run experiments to collect more data, and consider how to incorporate your models into the overall system you're developing. Speaking of data, let's now focus on how to find data for your project." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Gathering data" + "## Gathering Data" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "For many types of projects, you may be able to find all the data you need online. The project we'll be completing in this chapter is a *bear detector*. It will discriminate between three types of bear: grizzly, black, and teddy bear. There are many images on the Internet of each type of bear we can use. We just need a way to find them and download them. We've provided a tool you can use for this purpose, so you can follow along with this chapter, creating your own image recognition application for whatever kinds of object you're interested in. In the fast.ai course, thousands of students have presented their work on the course forums, displaying everything from Trinidad hummingbird varieties to Panama bus types, and even an application that helped one student let his fiancée recognize his sixteen cousins during Christmas vacation!" + "For many types of projects, you may be able to find all the data you need online. The project we'll be completing in this chapter is a *bear detector*. It will discriminate between three types of bear: grizzly, black, and teddy bears. 
There are many images on the internet of each type of bear that we can use. We just need a way to find them and download them. We've provided a tool you can use for this purpose, so you can follow along with this chapter and create your own image recognition application for whatever kinds of objects you're interested in. In the fast.ai course, thousands of students have presented their work in the course forums, displaying everything from hummingbird varieties in Trinidad to bus types in Panama--one student even created an application that would help his fiancée recognize his 16 cousins during Christmas vacation!" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "At the time of writing, Bing Image Search is the best option we know of for finding and downloading images. It's free for up to 1000 queries per month, and each query can download up to 150 images. However, something better might have come along between when we wrote this and when you're reading the book, so be sure to check out [book.fast.ai](https://book.fast.ai) where we'll let you know our current recommendation." + "At the time of writing, Bing Image Search is the best option we know of for finding and downloading images. It's free for up to 1,000 queries per month, and each query can download up to 150 images. However, something better might have come along between when we wrote this and when you're reading the book, so be sure to check out the book's website for our current recommendation." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "> important: Services that can be used for creating datasets come and go all the time, and their features, interfaces, and pricing change regularly too. In this section, we'll show how to use one particular provider, _Bing Image Search_, using the service they have as this book was written. 
We'll be providing more options and more up to date information on the [book website](https://book.fast.ai), so be sure to have a look there now to get the most current information on how to download images from the web to create a dataset for deep learning."
+    "> important: Keeping in Touch With the Latest Services: Services that can be used for creating datasets come and go all the time, and their features, interfaces, and pricing change regularly too. In this section, we'll show how to use the Bing Image Search API available as part of Azure Cognitive Services at the time this book was written. We'll be providing more options and more up-to-date information on the [book's website](https://book.fast.ai), so be sure to have a look there now to get the most current information on how to download images from the web to create a dataset for deep learning."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-    "To download images with Bing Image Search, you should sign up at Microsoft for Bing Image Search. You will be given a key, which you can either paste here, replacing \"XXX\":"
+    "To download images with Bing Image Search, sign up at Microsoft for a free account. You will be given a key, which you can copy and enter in a cell as follows (replacing 'XXX' with your key and executing it):"
   ]
  },
  {
@@ -289,7 +299,7 @@
   "metadata": {},
   "source": [
    "#clean\n",
-    "To download images with Bing Image Search, you should sign up at Microsoft for *Bing Image Search*. You will be given a key, which you can either paste here, replacing \"XXX\":"
+    "To download images with Bing Image Search, sign up at Microsoft for a free account. 
You will be given a key, which you can copy and enter in a cell as follows (replacing 'XXX' with your key and executing it):" ] }, { @@ -305,17 +315,17 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "...or, if you're comfortable at the command line, you can set it in your terminal with:\n", + "Or, if you're comfortable at the command line, you can set it in your terminal with:\n", "\n", " export AZURE_SEARCH_KEY=your_key_here\n", "\n", - "and then restart Jupyter notebooks, and finally execute in this notebook:\n", + "and then restart Jupyter Notebook, type this in a cell and execute it:\n", "\n", "```python\n", "key = os.environ['AZURE_SEARCH_KEY']\n", "```\n", "\n", - "Once you've set `key`, you can use `search_images_bing`. This function is provided by the small `utils` class included in the book. Remember, if you're not sure where a symbol is defined, you can just type it in your notebook to find out (or prefix with `?` to get help, including the name of the file where it's defined, or with `??` to get its source code):" + "Once you've set `key`, you can use `search_images_bing`. This function is provided by the small `utils` class included with the notebooks online. If you're not sure where a function is defined, you can just type it in your notebook to find out:" ] }, { @@ -415,7 +425,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "This seems to have worked nicely, so let's use fastai's `download_images` to download all the URLs from each of our search terms. We'll put each in a separate folder." + "This seems to have worked nicely, so let's use fastai's `download_images` to download all the URLs for each of our search terms. We'll put each in a separate folder:" ] }, { @@ -482,7 +492,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Often when we download files from the Internet, there are a few that are corrupt. Let's check:" + "Often when we download files from the internet, there are a few that are corrupt. 
Let's check:" ] }, { @@ -520,7 +530,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "To remove the failed images, we can use `unlink` on each. Note that, like most fastai functions that return a collection, `verify_images` returns an object of type `L`, which includes the `map` method. This calls the passed function on each element of the collection." + "To remove all the failed images, you can use `unlink` on each of them. Note that, like most fastai functions that return a collection, `verify_images` returns an object of type `L`, which includes the `map` method. This calls the passed function on each element of the collection:" ] }, { @@ -536,14 +546,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Sidebar: Getting help in Jupyter notebooks" + "### Sidebar: Getting Help in Jupyter Notebooks" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Jupyter notebooks are great to easily experiment and immediately see the results of each function, but there is also a lot of functionality to help figure out how to use the functions you have or even directly look at their source code. For instance, if you type in a cell\n", + "Jupyter notebooks are great for experimenting and immediately seeing the results of each function, but there is also a lot of functionality to help you figure out how to use different functions, or even directly look at their source code. For instance, if you type in a cell:\n", "```\n", "??verify_images\n", "```\n", @@ -558,16 +568,16 @@ "File: ~/git/fastai/fastai/vision/utils.py\n", "Type: function\n", "```\n", - "It tells us what argument the function accepts (`fns`) then shows us the source code and the file it comes from. 
Looking at that source code, we can see it applies the function `verify_image` in parallel and only keeps the ones for which the result of that function is `False`, which is consistent with the doc string: it finds the images in `fns` that can't be opened.\n", + "This tells us what argument the function accepts (`fns`), then shows us the source code and the file it comes from. Looking at that source code, we can see it applies the function `verify_image` in parallel and only keeps the image files for which the result of that function is `False`, which is consistent with the doc string: it finds the images in `fns` that can't be opened.\n", "\n", - "Here are the commands that are very useful in Jupyter notebooks:\n", + "Here are some other features that are very useful in Jupyter notebooks:\n", "\n", - "- at any point, if you don't remember the exact spelling of a function or argument name, you can press \"tab\" to get suggestions of auto-completion.\n", - "- when inside the parentheses of a function, pressing \"shift\" and \"tab\" simultaneously will display a window with the signature of the function and a short documentation. 
Pressing it twice will expand the documentation and pressing it three times will open a full window with the same information at the bottom of your screen.\n", - "- in a cell, typing `?func_name` and executing will open a window with the signature of the function and a short documentation.\n", - "- in a cell, typing `??func_name` and executing will open a window with the signature of the function, a short documentation and the source code.\n", - "- if you are using the fastai library, we added a `doc` function for you: executing `doc(func_name)` in a cell will open a window with the signature of the function, a short documentation and links to the source code on GitHub and the full documentation of the function in the [documentation of the library](https://docs.fast.ai).\n", - "- unrelated to the documentation but still very useful: to get help at any point if you get an error, type `%debug` in the next cell and execute to open the [python debugger](https://docs.python.org/3/library/pdb.html) that will let you inspect the content of every variable." + "- At any point, if you don't remember the exact spelling of a function or argument name, you can press Tab to get autocompletion suggestions.\n", + "- When inside the parentheses of a function, pressing Shift and Tab simultaneously will display a window with the signature of the function and a short description. 
Pressing these keys twice will expand the documentation, and pressing them three times will open a full window with the same information at the bottom of your screen.\n", + "- In a cell, typing `?func_name` and executing will open a window with the signature of the function and a short description.\n", + "- In a cell, typing `??func_name` and executing will open a window with the signature of the function, a short description, and the source code.\n", + "- If you are using the fastai library, we added a `doc` function for you: executing `doc(func_name)` in a cell will open a window with the signature of the function, a short description and links to the source code on GitHub and the full documentation of the function in the [library docs](https://docs.fast.ai).\n", + "- Unrelated to the documentation but still very useful: to get help at any point if you get an error, type `%debug` in the next cell and execute to open the [Python debugger](https://docs.python.org/3/library/pdb.html), which will let you inspect the content of every variable." ] }, { @@ -581,7 +591,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "One thing to be aware of in this process: as we discussed in <>, models can only reflect the data used to train them. And the world is full of biased data, which ends up reflected in, for example, Bing Image Search (which we used to create our dataset). For instance, let's say you were interested in creating an app which could help users figure out whether they had healthy skin, so you trained a model on the results of searches for (say) *healthy skin*. <> shows you the results you would get." + "One thing to be aware of in this process: as we discussed in <>, models can only reflect the data used to train them. And the world is full of biased data, which ends up reflected in, for example, Bing Image Search (which we used to create our dataset). 
For instance, let's say you were interested in creating an app that could help users figure out whether they had healthy skin, so you trained a model on the results of searches for (say) \"healthy skin.\" <> shows you the kinds of results you would get." ] }, { @@ -595,7 +605,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "So with this as your training data, you would end up not with a healthy skin detector, but a *young white woman touching her face* detector! Be sure to think carefully about the types of data that you might expect to see in practice in your application, and check carefully to ensure that all these types are reflected in your model's source data. footnote:[Thanks to Deb Raji, who came up with the *healthy skin* example. See her paper *Actionable Auditing: Investigating the Impact of Publicly Naming Biased Performance Results of Commercial AI Products* for more fascinating insights into model bias.]" + "With this as your training data, you would end up not with a healthy skin detector, but a *young white woman touching her face* detector! Be sure to think carefully about the types of data that you might expect to see in practice in your application, and check carefully to ensure that all these types are reflected in your model's source data. footnote:[Thanks to Deb Raji, who came up with the \"healthy skin\" example. See her paper [\"Actionable Auditing: Investigating the Impact of Publicly Naming Biased Performance Results of Commercial AI Products\"](https://dl.acm.org/doi/10.1145/3306618.3314244) for more fascinating insights into model bias.]" ] }, { @@ -609,14 +619,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## From data to DataLoaders" + "## From Data to DataLoaders" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Now that we have downloaded and verified the data that we want to use, we need to turn it into a `DataLoaders` object. 
`DataLoaders` is a thin class which just stores whatever `DataLoader` objects you pass to it, and makes them available as `train` and `valid` . Although it's a very simple class, it's very important in fastai: it provides the data for your model. The key functionality in `DataLoaders` is provided with just these 4 lines of code (it has some other minor functionality we'll skip over for now):\n", + "`DataLoaders` is a thin class that just stores whatever `DataLoader` objects you pass to it, and makes them available as `train` and `valid`. Although it's a very simple class, it's very important in fastai: it provides the data for your model. The key functionality in `DataLoaders` is provided with just these four lines of code (it has some other minor functionality we'll skip over for now):\n", "\n", "```python\n", "class DataLoaders(GetAttr):\n", @@ -630,23 +640,23 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "> jargon: DataLoaders: A fastai class which stores whatever `DataLoader` objects you pass to it, and makes them available as properties." + "> jargon: DataLoaders: A fastai class that stores multiple `DataLoader` objects you pass to it, normally a `train` and a `valid`, although it's possible to have as many as you like. The first two are made available as properties." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "A `DataLoaders` object (i.e. the plural) stores multiple `DataLoader` objects, normally a `train` and a `valid`, although it's possible to have as many as you like. 
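Stripped of fastai's `GetAttr` conveniences, the pattern above can be sketched in plain Python (a simplified stand-in, not the real class — here the "loaders" are just lists):

```python
class SimpleDataLoaders:
    "Stores whatever loaders it is given and exposes the first two as train/valid."
    def __init__(self, *loaders):
        self.loaders = loaders

    def __getitem__(self, i):
        return self.loaders[i]

    @property
    def train(self):  # first loader, by convention
        return self[0]

    @property
    def valid(self):  # second loader, by convention
        return self[1]

dls = SimpleDataLoaders([0, 1, 2], [3, 4])
```

The real class does the same job: it is mostly a container, which is why all the interesting decisions happen when you *build* the `DataLoader` objects it stores.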
(Later in the book we'll also learn about the `Dataset` and `Datasets` classes, which have the same relationship).\n", + "Later in the book you'll also learn about the `Dataset` and `Datasets` classes, which have the same relationship.\n", "\n", - "To turn our downloaded data into `DataLoaders` we need to tell fastai at least four things:\n", + "To turn our downloaded data into a `DataLoaders` object we need to tell fastai at least four things:\n", "\n", - "- what kinds of data we are working with ;\n", - "- how to get the list of items ;\n", - "- how to label these items ;\n", - "- how to create the validation set.\n", + "- What kinds of data we are working with\n", + "- How to get the list of items\n", + "- How to label these items\n", + "- How to create the validation set\n", "\n", - "So far we have seen a number of *factory methods* for particular combinations of these things, which are convenient when you have an application and data structure which happen to fit into those predefined methods. For when you don't, fastai has an extremely flexible system called the *data block API*. With this API you can fully customize every stage of the creation of your DataLoaders. Here is what we need to create a DataLoaders for the dataset that we just downloaded:" + "So far we have seen a number of *factory methods* for particular combinations of these things, which are convenient when you have an application and data structure that happen to fit into those predefined methods. For when you don't, fastai has an extremely flexible system called the *data block API*. With this API you can fully customize every stage of the creation of your `DataLoaders`. 
Here is what we need to create a `DataLoaders` for the dataset that we just downloaded:" ] }, { @@ -658,7 +668,7 @@ "bears = DataBlock(\n", " blocks=(ImageBlock, CategoryBlock), \n", " get_items=get_image_files, \n", - " splitter=RandomSplitter(valid_pct=0.3, seed=42),\n", + " splitter=RandomSplitter(valid_pct=0.2, seed=42),\n", " get_y=parent_label,\n", " item_tfms=Resize(128))" ] @@ -667,19 +677,22 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Let's look at each of these sections in turn:\n", + "Let's look at each of these arguments in turn. First we provide a tuple where we specify what types we want for the independent and dependent variables: \n", "\n", "```python\n", "blocks=(ImageBlock, CategoryBlock)\n", "```\n", "\n", - "This is a tuple where we specify what types we want for the *independent* and *dependent* variables. The *independent variable* is the thing we are using to make predictions from, and the *dependent variable* is our target. In this case, our independent variable is a set of images, and our dependent variable are the categories (type of bear) for each image. We will see many other types of block in the rest of this book.\n", + "The *independent variable* is the thing we are using to make predictions from, and the *dependent variable* is our target. In this case, our independent variables are images, and our dependent variables are the categories (type of bear) for each image. We will see many other types of block in the rest of this book.\n", + "\n", + "For this `DataLoaders` our underlying items will be file paths. We have to tell fastai how to get a list of those files. The `get_image_files` function takes a path, and returns a list of all of the images in that path (recursively, by default):\n", "\n", "```python\n", "get_items=get_image_files\n", "```\n", "\n", - "For this DataLoaders our underlying items will be file paths. We have to tell fastai how to get a list of those files. 
The `get_image_files` function takes a path, and returns a list of all of the images in that path (recursively, by default).\n", + "Often, datasets that you download will already have a validation set defined. Sometimes this is done by placing the images for the training and validation sets into different folders. Sometimes it is done by providing a CSV file in which each filename is listed along with which dataset it should be in. There are many ways that this can be done, and fastai provides a very general approach that allows you to use one of its predefined classes for this, or to write your own. In this case, however, we simply want to split our training and validation sets randomly. However, we would like to have the same training/validation split each time we run this notebook, so we fix the random seed (computers don't really know how to create random numbers at all, but simply create lists of numbers that look random; if you provide the same starting point for that list each time—called the *seed*—then you will get the exact same list each time):\n", + "\n", "\n", "```python\n", "splitter=RandomSplitter(valid_pct=0.2, seed=42)\n", @@ -690,21 +703,21 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Often, datasets that you download will already have a validation set defined. Sometimes this is done by placing the images for the training and validation sets into different folders. Sometimes it is done by providing a CSV in which each file name is listed along with which dataset it should be in. There are many ways that this can be done, and fastai provides a very general approach which allows you to use one of fastai's predefined classes for this, or to write your own. In this case, however, we simply want to split our training and validation sets randomly. However, we would like to have the same training/validation split each time we run this notebook, so we fix the random seed. 
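A simplified stand-in for `RandomSplitter` (not fastai's implementation) shows why fixing the seed gives a reproducible split:

```python
import random

def random_split(n, valid_pct=0.2, seed=42):
    "Shuffle the indices 0..n-1 with a fixed seed, then split off valid_pct for validation."
    idxs = list(range(n))
    random.Random(seed).shuffle(idxs)  # a seeded generator always produces the same shuffle
    cut = int(valid_pct * n)
    return idxs[cut:], idxs[:cut]      # train indices, valid indices

train_a, valid_a = random_split(100)
train_b, valid_b = random_split(100)   # same seed -> identical split every run
```

Because both calls use the same seed, `valid_a` and `valid_b` are identical, so the model is always validated on the same held-out items across notebook runs.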
(Computers don't really know how to create random numbers at all, but simply create lists of numbers which look random. If you provide the same starting point for that list each time — called the *seed* — then you will get the exact same list each time.)\n", + "The independent variable is often referred to as `x` and the dependent variable is often referred to as `y`. Here, we are telling fastai what function to call to create the labels in our dataset:\n", "\n", "```python\n", "get_y=parent_label\n", "```\n", "\n", - "The independent variable is often referred to as \"x\" and the dependent variable is often referred to as \"y\". So in this section we are telling fastai what function to call to create the labels in our dataset. `parent_label` is a function provided by fastai which simply gets the name of the folder which a file is in. Because we put each of our bear images into folders based on the type of bear, this is going to give us the labels that we need.\n", + "`parent_label` is a function provided by fastai that simply gets the name of the folder a file is in. Because we put each of our bear images into folders based on the type of bear, this is going to give us the labels that we need.\n", + "\n", + "Our images are all different sizes, and this is a problem for deep learning: we don't feed the model one image at a time but several of them (what we call a *mini-batch*). To group them in a big array (usually called a *tensor*) that is going to go through our model, they all need to be of the same size. So, we need to add a transform which will resize these images to the same size. *Item transforms* are pieces of code that run on each individual item, whether it be an image, category, or so forth. 
fastai includes many predefined transforms; we use the `Resize` transform here:\n", "\n", "```python\n", "item_tfms=Resize(128)\n", "```\n", "\n", - "Our images are all different sizes, and this is a problem for deep learning: we don't feed the model one image at a time but several (what we call a *mini-batch*) of them. To group them in a big array (usually called *tensor*) that is going to go through our model, they all need to be of the same size. So we need to add a transform which will resize these images to the same size. *item transforms* are pieces of code which run on each individual item, whether it be an image, category, or so forth. fastai includes many predefined transforms; we will use the `Resize` transform here.\n", - "\n", - "This command has given us a `DataBlock` object. This is like a *template* for creating a `DataLoaders`. We still need to tell fastai the actual source of our data — in this case, the path where the images can be found." + "This command has given us a `DataBlock` object. This is like a *template* for creating a `DataLoaders`. We still need to tell fastai the actual source of our data—in this case, the path where the images can be found:" ] }, { @@ -720,7 +733,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "A DataLoaders includes validation and training `DataLoader`s. A `DataLoader` is a class which provides *batches* of a few items at a time to the GPU. We'll be learning a lot more about this class in the next chapter. When you loop through a `DataLoader` fastai will give you 64 (by default) items at a time, all stacked up into a single tensor. We can take a look at a few of those items by calling the `show_batch` method on a `DataLoader`:" + "A `DataLoaders` includes validation and training `DataLoader`s. `DataLoader` is a class that provides batches of a few items at a time to the GPU. We'll be learning a lot more about this class in the next chapter. 
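The batching just described can be sketched in plain Python (a toy stand-in for `DataLoader`, ignoring shuffling, transforms, and tensor stacking):

```python
def batches(items, bs=64):
    "Yield successive batches of up to bs items each."
    for i in range(0, len(items), bs):
        yield items[i:i + bs]

# 150 items with a batch size of 64 gives two full batches and one partial one
sizes = [len(b) for b in batches(list(range(150)), bs=64)]  # -> [64, 64, 22]
```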
When you loop through a `DataLoader` fastai will give you 64 (by default) items at a time, all stacked up into a single tensor. We can take a look at a few of those items by calling the `show_batch` method on a `DataLoader`:" ] }, { @@ -749,7 +762,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "By default `Resize` *crops* the images to fit a square shape of the size requested, using the full width or height. This can result in losing some important details. Alternatively, you can ask fastai to pad the images with zeros (which is black), or squish/stretch them:" + "By default `Resize` *crops* the images to fit a square shape of the size requested, using the full width or height. This can result in losing some important details. Alternatively, you can ask fastai to pad the images with zeros (black), or squish/stretch them:" ] }, { @@ -804,13 +817,13 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "All of these approaches seem somewhat wasteful, or problematic. If we squished or stretched the images then they end up as unrealistic shapes, leading to a model that learns that things look different to how they actually are, which we would expect to result in lower accuracy. If we crop the images then we remove some of the features that allow us to recognize them. For instance, if we were trying to recognise the breed of dog or cat, we may end up cropping out a key part of the body or the face necessary to distinguish between similar breeds. If we pad the images then we have a whole lot of empty space, which is just wasted computation for our model, and results in a lower effective resolution for the part of the image we actually use.\n", + "All of these approaches seem somewhat wasteful, or problematic. If we squish or stretch the images they end up as unrealistic shapes, leading to a model that learns that things look different to how they actually are, which we would expect to result in lower accuracy. 
If we crop the images then we remove some of the features that allow us to perform recognition. For instance, if we were trying to recognize breeds of dog or cat, we might end up cropping out a key part of the body or the face necessary to distinguish between similar breeds. If we pad the images then we have a whole lot of empty space, which is just wasted computation for our model and results in a lower effective resolution for the part of the image we actually use.\n", "\n", "Instead, what we normally do in practice is to randomly select part of the image, and crop to just that part. On each epoch (which is one complete pass through all of our images in the dataset) we randomly select a different part of each image. This means that our model can learn to focus on, and recognize, different features in our images. It also reflects how images work in the real world: different photos of the same thing may be framed in slightly different ways.\n", "\n", - "In fact, an entirely untrained neural network knows nothing whatsoever about how images behave. It doesn't even recognise that when an object is rotated by one degree, then it still is a picture of the same thing! So actually training the neural network with examples of images that are in slightly different places, and slightly different sizes, helps it to understand the basic concept of what an *object* is, and how it can be represented in an image.\n", + "In fact, an entirely untrained neural network knows nothing whatsoever about how images behave. It doesn't even recognize that when an object is rotated by one degree, it still is a picture of the same thing! 
So actually training the neural network with examples of images where the objects are in slightly different places and slightly different sizes helps it to understand the basic concept of what an object is, and how it can be represented in an image.\n", "\n", - "Here is another copy of the previous examples, but this time we are replacing `Resize` with `RandomResizedCrop`, which is the transform that provides the behaviour described above. The most important parameter to pass in is `min_scale`, which determines how much of the image to select at minimum each time." + "Here's another example where we replace `Resize` with `RandomResizedCrop`, which is the transform that provides the behavior we just described. The most important parameter to pass in is `min_scale`, which determines how much of the image to select at minimum each time:" ] }, { @@ -841,21 +854,21 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We use `unique=True` to have the same image repeated with different versions of this `RandomResizedCrop` transform. This is a specific example of a more general technique, called *data augmentation*." + "We used `unique=True` to have the same image repeated with different versions of this `RandomResizedCrop` transform. This is a specific example of a more general technique, called data augmentation." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Data augmentation" + "### Data Augmentation" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Data augmentation refers to creating random variations of our input data, such that they appear different, but are not expected to change the meaning of the data. Examples of common data augmentation for images are rotation, flipping, perspective warping, brightness changes, contrast changes, and much more. 
For natural photo images such as the ones we are using here, there is a standard set of augmentations which we have found work pretty well, and are provided with the `aug_transforms` function. Because the images are now all the same size, we can apply these augmentations to an entire batch of them using the GPU, which will save a lot of time. To tell fastai we want to use these transforms on a batch, we use the `batch_tfms` parameter. (Note that we're not using `RandomResizedCrop` in this example, so you can see the differences more clearly; we're also using double the amount of augmentation compared to the default, for the same reason)." + "*Data augmentation* refers to creating random variations of our input data, such that they appear different, but do not actually change the meaning of the data. Examples of common data augmentation techniques for images are rotation, flipping, perspective warping, brightness changes and contrast changes. For natural photo images such as the ones we are using here, a standard set of augmentations that we have found work pretty well are provided with the `aug_transforms` function. Because our images are now all the same size, we can apply these augmentations to an entire batch of them using the GPU, which will save a lot of time. 
To tell fastai we want to use these transforms on a batch, we use the `batch_tfms` parameter (note that we're not using `RandomResizedCrop` in this example, so you can see the differences more clearly; we're also using double the amount of augmentation compared to the default, for the same reason):" ] }, { @@ -893,7 +906,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Training your model, and using it to clean your data" + "## Training Your Model, and Using It to Clean Your Data" ] }, { @@ -902,7 +915,7 @@ "source": [ "Time to use the same lines of code as in <> to train our bear classifier.\n", "\n", - "We don't have a lot of data for our problem (150 pictures of each sort of bear at most), so to train our model, we'll use `RandomResizedCrop` and default `aug_transforms` for our model, on an image size of 224px, which is fairly standard for image classification." + "We don't have a lot of data for our problem (150 pictures of each sort of bear at most), so to train our model, we'll use `RandomResizedCrop` with an image size of 224 px, which is fairly standard for image classification, and default `aug_transforms`:" ] }, { @@ -921,7 +934,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We can now create our `Learner` and fine tune it in the usual way." + "We can now create our `Learner` and fine-tune it in the usual way:" ] }, { @@ -1022,7 +1035,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Now let's see whether the mistakes the model is making are mainly thinking that grizzlies are teddies (that would be bad for safety!), or that grizzlies are black bears, or something else. We can create a *confusion matrix*:" + "Now let's see whether the mistakes the model is making are mainly thinking that grizzlies are teddies (that would be bad for safety!), or that grizzlies are black bears, or something else. 
To visualize this, we can create a *confusion matrix*:" ] }, { @@ -1062,11 +1075,11 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Each row here represents all the black, grizzly, and teddy bears in our dataset, respectively. Each column represents the images which the model predicted as black, grizzly, and teddy bears, respectively. Therefore, the diagonal of the matrix shows the images which were classified correctly, and the other, off diagonal, cells represent those which were classified incorrectly. This is called a *confusion matrix* and is one of the many ways that fastai allows you to view the results of your model. It is (of course!) calculated using the validation set. With the color coding, the goal is to have white everywhere, except the diagonal where we want dark blue. Our bear classifier isn't making many mistakes!\n", + "The rows represent all the black, grizzly, and teddy bears in our dataset, respectively. The columns represent the images which the model predicted as black, grizzly, and teddy bears, respectively. Therefore, the diagonal of the matrix shows the images which were classified correctly, and the off-diagonal cells represent those which were classified incorrectly. This is one of the many ways that fastai allows you to view the results of your model. It is (of course!) calculated using the validation set. With the color-coding, the goal is to have white everywhere except the diagonal, where we want dark blue. Our bear classifier isn't making many mistakes!\n", "\n", - "It's helpful to see where exactly our errors are occurring, to see whether it's due to a dataset problem (e.g. images that aren't bears at all, or are labelled incorrectly, etc.), or a model problem (e.g. perhaps it isn't handling images taken with unusual lighting, or from a different angle, etc.). 
To do this, we can sort our images by their *loss*.\n", + "It's helpful to see where exactly our errors are occurring, to see whether they're due to a dataset problem (e.g., images that aren't bears at all, or are labeled incorrectly, etc.), or a model problem (perhaps it isn't handling images taken with unusual lighting, or from a different angle, etc.). To do this, we can sort our images by their *loss*.\n", "\n", - "The *loss* is a number that is higher if the model is incorrect (and especially if it's also confident of its incorrect answer), or if it's correct, but not confident of its correct answer. In a couple chapters we'll learn in depth how loss is calculated and used in the training process. For now, `plot_top_losses` shows us the images with the highest loss in our dataset. As the title of the output says, each image is labeled with four things: prediction, actual (target label), loss, and probability. The *probability* here is the confidence level, from zero to one, that the model has assigned to its prediction." + "The loss is a number that is higher if the model is incorrect (especially if it's also confident of its incorrect answer), or if it's correct, but not confident of its correct answer. In a couple of chapters we'll learn in depth how loss is calculated and used in the training process. For now, `plot_top_losses` shows us the images with the highest loss in our dataset. As the title of the output says, each image is labeled with four things: prediction, actual (target label), loss, and probability. The *probability* here is the confidence level, from zero to one, that the model has assigned to its prediction:" ] }, { @@ -1095,11 +1108,11 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "This output shows that the highest loss is an image that has been predicted as \"grizzly\" with high confidence. However, it's labeled (based on our Bing image search) as \"black\". 
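The claim that loss is highest for confident mistakes can be illustrated with the negative log likelihood of the probability the model assigned to the true class (a simplified view of the loss functions we'll study properly later):

```python
import math

def nll(p_true):
    "Negative log likelihood of the probability assigned to the true class."
    return -math.log(p_true)

confident_right = nll(0.99)  # tiny loss
unsure          = nll(0.50)  # moderate loss
confident_wrong = nll(0.01)  # large loss: only 1% was assigned to the true class
```

Sorting by this quantity surfaces exactly the images `plot_top_losses` shows: confident errors first, then correct-but-unsure predictions.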
We're not bear experts, but it sure looks to us like this label is incorrect! We should probably change its label to \"grizzly\".\n", + "This output shows that the image with the highest loss is one that has been predicted as \"grizzly\" with high confidence. However, it's labeled (based on our Bing image search) as \"black.\" We're not bear experts, but it sure looks to us like this label is incorrect! We should probably change its label to \"grizzly.\"\n", "\n", - "The intuitive approach to doing data cleaning is to do it *before* you train a model. But as you've seen in this case, a model can actually help you find data issues more quickly and easily. So we normally prefer to train a quick and simple model first, and then use it to help us with data cleaning.\n", + "The intuitive approach to doing data cleaning is to do it *before* you train a model. But as you've seen in this case, a model can actually help you find data issues more quickly and easily. So, we normally prefer to train a quick and simple model first, and then use it to help us with data cleaning.\n", "\n", - "fastai includes a handy GUI for data cleaning called `ImageClassifierCleaner` which allows you to choose a category, and training vs validation set, and view the highest-loss images (in order), along with menus to allow any images to be selected for removal or relabeling." + "fastai includes a handy GUI for data cleaning called `ImageClassifierCleaner` that allows you to choose a category and the training versus validation set and view the highest-loss images (in order), along with menus to allow images to be selected for removal or relabeling:" ] }, { @@ -1170,19 +1183,19 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We can see that amongst our *black bears* is an image that contains two bears: one grizzly, one black. So we should choose `` in the menu under this image. 
`ImageClassifierCleaner` doesn't actually do the deleting or changing of labels for you; it just returns the indices of items to change. So, for instance, to delete (`unlink`) all images selected for deletion, we would run:\n", + "We can see that amongst our \"black bears\" is an image that contains two bears: one grizzly, one black. So, we should choose `` in the menu under this image. `ImageClassifierCleaner` doesn't actually do the deleting or changing of labels for you; it just returns the indices of items to change. So, for instance, to delete (`unlink`) all images selected for deletion, we would run:\n", "\n", "```python\n", "for idx in cleaner.delete(): cleaner.fns[idx].unlink()\n", "```\n", "\n", - "To move images where we've selected a different category, we would run:\n", + "To move images for which we've selected a different category, we would run:\n", "\n", "```python\n", "for idx,cat in cleaner.change(): shutil.move(str(cleaner.fns[idx]), path/cat)\n", "```\n", "\n", - "> s: Cleaning the data and getting it ready for your model are two of the biggest challenges for data scientists; they say it takes 90% of their time. The fastai library aims at providing tools to make it as easy as possible.\n", + "> s: Cleaning the data and getting it ready for your model are two of the biggest challenges for data scientists; they say it takes 90% of their time. The fastai library aims to provide tools that make it as easy as possible.\n", "\n", "We'll be seeing more examples of model-driven data cleaning throughout this book. Once we've cleaned up our data, we can retrain our model. Try it yourself, and see if your accuracy improves!" ] @@ -1191,8 +1204,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "\n", - "> note: After cleaning the dataset using the above steps, we generally are seeing 100% accuracy on this task. We even see that result when we download a lot fewer images than the 150 per class we're using here. 
As you can see, the common complaint that _you need massive amounts of data to do deep learning_ can be a very long way from the truth!" + "> note: No Need for Big Data: After cleaning the dataset using these steps, we generally are seeing 100% accuracy on this task. We even see that result when we download a lot fewer images than the 150 per class we're using here. As you can see, the common complaint that _you need massive amounts of data to do deep learning_ can be a very long way from the truth!" ] }, { @@ -1206,7 +1218,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Turning your model into an online application" + "## Turning Your Model into an Online Application" ] }, { @@ -1220,18 +1232,18 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Using the model for inference" + "### Using the Model for Inference" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Once you've got a model you're happy with, you need to save it, so that you can then copy it over to a server where you'll use it in production. Remember that a model consists of two parts: the *architecture* and the trained *parameters*. The easiest way to save a model is to save both of these, because that way when you load a model you can be sure that you have the matching architecture and parameters. To save both parts, use the `export` method.\n", + "Once you've got a model you're happy with, you need to save it, so that you can then copy it over to a server where you'll use it in production. Remember that a model consists of two parts: the *architecture* and the trained *parameters*. The easiest way to save the model is to save both of these, because that way when you load a model you can be sure that you have the matching architecture and parameters. To save both parts, use the `export` method.\n", "\n", "This method even saves the definition of how to create your `DataLoaders`. 
This is important, because otherwise you would have to redefine how to transform your data in order to use your model in production. fastai automatically uses your validation set `DataLoader` for inference by default, so your data augmentation will not be applied, which is generally what you want.\n", "\n", - "When you call export, fastai will save a file called `export.pkl`." + "When you call `export`, fastai will save a file called \"export.pkl\":" ] }, { @@ -1247,7 +1259,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Let's check that the file exists, by using the `Path.ls` method that fastai adds to Python's `Path` class:" + "Let's check that the file exists, by using the `ls` method that fastai adds to Python's `Path` class:" ] }, { @@ -1357,21 +1369,21 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We can see here that if we index into the vocab with the integer returned by `predict` then we get back \"grizzly\", as expected. Also, note that if we index into the list of probabilities, we see a nearly 1.00 probability that this is a grizzly." + "We can see here that if we index into the vocab with the integer returned by `predict` then we get back \"grizzly,\" as expected. Also, note that if we index into the list of probabilities, we see a nearly 1.00 probability that this is a grizzly." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "We know how to make predictions from our saved model, so we have everything we need to start building our app. We can do it directly in a Jupyter Notebook." + "We know how to make predictions from our saved model, so we have everything we need to start building our app. We can do it directly in a Jupyter notebook." 
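The export/load round trip is a standard serialize-then-deserialize pattern; here is a sketch using Python's `pickle` directly (a stand-in for `learn.export` and `load_learner`, with a hypothetical toy model in place of a trained `Learner`):

```python
import pickle
import tempfile
from pathlib import Path

class ToyClassifier:
    "A hypothetical stand-in for a trained Learner."
    vocab = ["black", "grizzly", "teddy"]
    def predict(self, x):
        return self.vocab[x % len(self.vocab)]

path = Path(tempfile.mkdtemp()) / "export.pkl"
path.write_bytes(pickle.dumps(ToyClassifier()))  # analogous to learn.export()

model = pickle.loads(path.read_bytes())          # analogous to load_learner(path)
result = model.predict(1)
```

The point of `export` over plain pickling is that fastai also bundles the `DataLoaders` definition, so the inference-time transforms travel with the model.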
] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Creating a Notebook app from the model" + "### Creating a Notebook App from the Model" ] }, { @@ -1380,16 +1392,16 @@ "source": [ "To use our model in an application, we can simply treat the `predict` method as a regular function. Therefore, creating an app from the model can be done using any of the myriad of frameworks and techniques available to application developers.\n", "\n", - "However, most data scientists are not familiar with the world of web application development. So let's try using something that you do, at this point, know: Jupyter notebooks. It turns out that we can create a complete working web application using nothing but Jupyter notebooks! The two things we need to make this happen are:\n", + "However, most data scientists are not familiar with the world of web application development. So let's try using something that you do, at this point, know: it turns out that we can create a complete working web application using nothing but Jupyter notebooks! The two things we need to make this happen are:\n", "\n", "- IPython widgets (ipywidgets)\n", "- Voilà\n", "\n", "*IPython widgets* are GUI components that bring together JavaScript and Python functionality in a web browser, and can be created and used within a Jupyter notebook. For instance, the image cleaner that we saw earlier in this chapter is entirely written with IPython widgets. However, we don't want to require users of our application to run Jupyter themselves.\n", "\n", - "That is why *Voilà* exists. It is a system for making applications consisting of IPython widgets available to end-users, without them having to use Jupyter at all. Voilà is taking advantage of the fact that a notebook _already is_ a kind of web application, just a rather complex one that depends on another web application: Jupyter itself. 
Essentially, it helps us automatically convert the complex web application which we've already implicitly made (the notebook) into a simpler, easier-to-deploy web application, which functions like a normal web application rather than like a notebook.\n", + "That is why *Voilà* exists. It is a system for making applications consisting of IPython widgets available to end users, without them having to use Jupyter at all. Voilà is taking advantage of the fact that a notebook _already is_ a kind of web application, just a rather complex one that depends on another web application: Jupyter itself. Essentially, it helps us automatically convert the complex web application we've already implicitly made (the notebook) into a simpler, easier-to-deploy web application, which functions like a normal web application rather than like a notebook.\n", "\n", - "But we still have the advantage of developing in a notebook. So with ipywidgets, we can build up our GUI step by step. We will use this approach to create a simple image classifier. First, we need a file upload widget:" + "But we still have the advantage of developing in a notebook, so with ipywidgets, we can build up our GUI step by step. We will use this approach to create a simple image classifier. First, we need a file upload widget:" ] }, { @@ -1509,7 +1521,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "...and use a `Label` to display them:" + "and use a `Label` to display them:" ] }, { @@ -1545,7 +1557,7 @@ "source": [ "`Prediction: grizzly; Probability: 1.0000`\n", "\n", - "We'll need a button to do the classification; it looks exactly like the upload button." + "We'll need a button to do the classification. 
It looks exactly like the upload button:" ] }, { @@ -1578,7 +1590,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "...and a *click event handler*, that is, a function that will be called when it's pressed; we can just copy over the lines of code from above:" + "We'll also need a *click event handler*; that is, a function that will be called when it's pressed. We can just copy over the lines of code from above:" ] }, { @@ -1601,7 +1613,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "You can test the button now by pressing it, and you should see the image and predictions above update automatically!\n", + "You can test the button now by pressing it, and you should see the image and predictions update automatically!\n", "\n", "We can now put them all in a vertical box (`VBox`) to complete our GUI:" ] @@ -1661,7 +1673,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Turning your notebook into a real app" + "### Turning Your Notebook into a Real App" ] }, { @@ -1679,18 +1691,18 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Now that we have everything working in this Jupyter notebook, we can create our application. To do this, create a notebook which contains only the code needed to create and show the widgets that you need, and markdown for any text that you want to appear. Have a look at the *bear_classifier* notebook in the book repo to see the simple notebook application we created.\n", + "Now that we have everything working in this Jupyter notebook, we can create our application. To do this, start a new notebook and add to it only the code needed to create and show the widgets that you need, and markdown for any text that you want to appear. 
Have a look at the *bear_classifier* notebook in the book's repo to see the simple notebook application we created.\n", "\n", - "Next, install Voilà if you have not already, by copying these lines into a Notebook cell, and executing it (if you're comfortable using the command line, you can also execute these two lines in your terminal, without the `!` prefix):\n", + "Next, install Voilà if you haven't already, by copying these lines into a notebook cell and executing it:\n", "\n", " !pip install voila\n", " !jupyter serverextension enable voila --sys-prefix\n", "\n", - "Cells which begin with a `!` do not contain Python code, but instead contain code which is passed to your shell, such as bash, power shell in windows, or so forth. If you are comfortable using the command line (which we'll be learning about later in this book), you can of course simply type these two lines (without the `!` prefix) directly into your terminal. In this case, the first line installs the voila library and application, and the second connects it to your existing Jupyter notebook.\n", + "Cells that begin with a `!` do not contain Python code, but instead contain code that is passed to your shell (bash, Windows PowerShell, etc.). If you are comfortable using the command line, which we'll discuss more later in this book, you can of course simply type these two lines (without the `!` prefix) directly into your terminal. In this case, the first line installs the `voila` library and application, and the second connects it to your existing Jupyter notebook.\n", "\n", - "Voilà runs Jupyter notebooks, just like the Jupyter notebook server you are using now does, except that it does something very important: it removes all of the cell inputs, and only shows output (including ipywidgets), along with your markdown cells. So what's left is a web application! To view your notebook as a voila web application, replace the word \"notebooks\" in your browser's URL with: \"voila/render\". 
You will see the same content as your notebook, but without any of the code cells.\n", + "Voilà runs Jupyter notebooks just like the Jupyter notebook server you are using now does, but it also does something very important: it removes all of the cell inputs, and only shows output (including ipywidgets), along with your markdown cells. So what's left is a web application! To view your notebook as a Voilà web application, replace the word \"notebooks\" in your browser's URL with: \"voila/render\". You will see the same content as your notebook, but without any of the code cells.\n", "\n", - "Of course, you don't need to use Voilà or ipywidgets. Your model is just a function you can call: `pred,pred_idx,probs = learn.predict(img)` . So you can use it with any framework, hosted on any platform. And you can take something you've prototyped in ipywidgets and Voilà and later convert it into a regular web application. We're showing you this approach in the book because we think it's a great way for data scientists and other folks that aren't web development experts to create applications from their models.\n", + "Of course, you don't need to use Voilà or ipywidgets. Your model is just a function you can call (`pred,pred_idx,probs = learn.predict(img)`), so you can use it with any framework, hosted on any platform. And you can take something you've prototyped in ipywidgets and Voilà and later convert it into a regular web application. We're showing you this approach in the book because we think it's a great way for data scientists and other folks that aren't web development experts to create applications from their models.\n", "\n", "We have our app, now let's deploy it!" ] @@ -1706,28 +1718,28 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "As we now know, you need a GPU to train nearly any useful deep learning model. So, do you need a GPU to use that model in production? No! You almost certainly **do not need a GPU to serve your model in production**. 
There are a few reasons for this:\n", + "As you now know, you need a GPU to train nearly any useful deep learning model. So, do you need a GPU to use that model in production? No! You almost certainly *do not need a GPU to serve your model in production*. There are a few reasons for this:\n", "\n", - "- As we've seen, GPUs are only useful when they do lots of identical work in parallel. If you're doing (say) image classification, then you'll normally be classifying just one user's image at a time, and there isn't normally enough work to do in a single image to keep a GPU busy for long enough for it to be very efficient. So a CPU will often be more cost effective.\n", - "- An alternative could be to wait for a few users to submit their images, and then batch them up, and do them all at once on a GPU. But then you're asking your users to wait, rather than getting answers straight away! And you need a high volume site for this to be workable. If you do need this functionality, you can use a tool such as Microsoft's [ONNX Runtime](https://github.com/microsoft/onnxruntime), or [AWS Sagemaker](https://aws.amazon.com/sagemaker/)\n", - "- The complexities of dealing with GPU inference are significant. In particular, the GPU's memory will need careful manual management, and you'll need some careful queueing system to ensure you only do one batch at a time.\n", - "- There's a lot more market competition in CPU servers than GPU, as a result of which there are much cheaper options available for CPU servers.\n", + "- As we've seen, GPUs are only useful when they do lots of identical work in parallel. If you're doing (say) image classification, then you'll normally be classifying just one user's image at a time, and there isn't normally enough work to do in a single image to keep a GPU busy for long enough for it to be very efficient. 
So, a CPU will often be more cost-effective.\n", + "- An alternative could be to wait for a few users to submit their images, and then batch them up and process them all at once on a GPU. But then you're asking your users to wait, rather than getting answers straight away! And you need a high-volume site for this to be workable. If you do need this functionality, you can use a tool such as Microsoft's [ONNX Runtime](https://github.com/microsoft/onnxruntime) or [AWS SageMaker](https://aws.amazon.com/sagemaker/).\n", + "- The complexities of dealing with GPU inference are significant. In particular, the GPU's memory will need careful manual management, and you'll need a careful queueing system to ensure you only process one batch at a time.\n", + "- There's a lot more market competition in CPU than GPU servers, as a result of which there are much cheaper options available for CPU servers.\n", "\n", - "Because of the complexity of GPU serving, many systems have sprung up to try to automate this. However, managing and running these systems is also complex, and generally requires compiling your model into a different form that's specialized for that system. It doesn't make sense to deal with this complexity until/unless your app gets popular enough that it makes clear financial sense for you to do so." + "Because of the complexity of GPU serving, many systems have sprung up to try to automate this. However, managing and running these systems is also complex, and generally requires compiling your model into a different form that's specialized for that system. It's typically preferable to avoid dealing with this complexity until/unless your app gets popular enough that it makes clear financial sense for you to do so." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "For at least the initial prototype of your application, and for any hobby projects that you want to show off, you can easily host them for free. 
The best place and the best way to do this will vary over time so check the book website for the most up-to-date recommendations. As we're writing this book in 2020 the simplest (and free!) approach is called [Binder](https://mybinder.org/). To publish your web app on Binder, you follow these steps:\n", + "For at least the initial prototype of your application, and for any hobby projects that you want to show off, you can easily host them for free. The best place and the best way to do this will vary over time, so check the book's website for the most up-to-date recommendations. As we're writing this book in early 2020, the simplest (and free!) approach is to use [Binder](https://mybinder.org/). To publish your web app on Binder, you follow these steps:\n", "\n", - "1. Add your notebook to a [GitHub repository](http://github.com/), \n", "2. Paste the URL of that repo in the URL field of Binder as shown in <>, \n", "3. Change the \"File\" dropdown to instead select \"URL\",\n", "4. In the Path field, enter `/voila/render/name.ipynb` (replacing `name.ipynb` as appropriate for your notebook):\n", "5. Click the \"Copy the URL\" button and paste it somewhere safe. \n", "6. Click \"Launch\"." + "1. Add your notebook to a [GitHub repository](http://github.com/).\n", "2. Paste the URL of that repo into Binder's URL field, as shown in <>.\n", "3. Change the File dropdown to instead select URL.\n", "4. In the \"URL to open\" field, enter `/voila/render/name.ipynb` (replacing `name` with the name of your notebook).\n", "5. Click the clipboard button at the bottom right to copy the URL and paste it somewhere safe. \n", "6. Click Launch." ] }, { @@ -1741,29 +1753,29 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The first time you do this, Binder will take around 5 minutes to build your site. 
In other words, it is finding a virtual machine which can run your app, allocating storage, collecting the files needed for Jupyter, for your notebook, and for presenting your notebook as a web application. It's doing all of this behind the scenes.\n", + "The first time you do this, Binder will take around 5 minutes to build your site. Behind the scenes, it is finding a virtual machine that can run your app, allocating storage, collecting the files needed for Jupyter, for your notebook, and for presenting your notebook as a web application.\n", "\n", "Finally, once it has started the app running, it will navigate your browser to your new web app. You can share the URL you copied to allow others to access your app as well.\n", "\n", - "For other (both free and paid) options for deploying your web app, be sure to take a look at the book web site." + "For other (both free and paid) options for deploying your web app, be sure to take a look at the [book's website](https://book.fast.ai/)." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "You may well want to deploy your application onto mobile devices, or edge devices such as a Raspberry Pi. There are a lot of libraries and frameworks to allow you to integrate a model directly into a mobile application. However, these approaches tend to require a lot of extra steps and boilerplate, and do not always support all the PyTorch and fastai layers that your model might use. In addition, the work you do will depend on what kind of mobile devices you are targeting for deployment. So you might need to do some work to run on iOS devices, different work to run on newer Android devices, different work for older Android devices, etc. Instead, we recommend wherever possible that you deploy the model itself to a server, and have your mobile or edge application connect to it as a web service.\n", + "You may well want to deploy your application onto mobile devices, or edge devices such as a Raspberry Pi. 
There are a lot of libraries and frameworks that allow you to integrate a model directly into a mobile application. However, these approaches tend to require a lot of extra steps and boilerplate, and do not always support all the PyTorch and fastai layers that your model might use. In addition, the work you do will depend on what kind of mobile devices you are targeting for deployment--you might need to do some work to run on iOS devices, different work to run on newer Android devices, different work for older Android devices, etc. Instead, we recommend wherever possible that you deploy the model itself to a server, and have your mobile or edge application connect to it as a web service.\n", "\n", - "There are quite a few upsides to this approach. The initial installation is easier, because you only have to deploy a small GUI application, which connects to the server to do all the heavy lifting. More importantly perhaps, upgrades of that core logic can happen on your server, rather than needing to be distributed to all of your users. Your server can have a lot more memory and processing capacity than most edge devices, and it is far easier to scale those resources if your model becomes more demanding. The hardware that you will have on a server is going to be more standard and more easily supported by fastai and PyTorch, so you don't have to compile your model into a different form.\n", + "There are quite a few upsides to this approach. The initial installation is easier, because you only have to deploy a small GUI application, which connects to the server to do all the heavy lifting. More importantly perhaps, upgrades of that core logic can happen on your server, rather than needing to be distributed to all of your users. Your server will have a lot more memory and processing capacity than most edge devices, and it is far easier to scale those resources if your model becomes more demanding. 
The hardware that you will have on a server is also going to be more standard and more easily supported by fastai and PyTorch, so you don't have to compile your model into a different form.\n", "\n", - "There are downsides too, of course. Your application will require a network connection, and there will be some latency each time the model is called. It takes a while for a neural network model to run anyway, so this additional network latency may not make a big difference to your users in practice. In fact, since you can use better hardware on the server, the overall latency may even be less! If your application uses sensitive data then your users may be concerned about an approach which sends that data to a remote server, so sometimes privacy considerations will mean that you need to run the model on the edge device. Sometimes this can be avoided by having an *on premise* server, such as inside a company's firewall. Managing the complexity and scaling the server can create additional overhead, whereas if your model runs on the edge devices then each user is bringing their own compute resources, which leads to easier scaling with an increasing number of users (also known as *horizontal scaling*)." + "There are downsides too, of course. Your application will require a network connection, and there will be some latency each time the model is called. (It takes a while for a neural network model to run anyway, so this additional network latency may not make a big difference to your users in practice. In fact, since you can use better hardware on the server, the overall latency may even be less than if it were running locally!) Also, if your application uses sensitive data then your users may be concerned about an approach which sends that data to a remote server, so sometimes privacy considerations will mean that you need to run the model on the edge device (it may be possible to avoid this by having an *on-premise* server, such as inside a company's firewall). 
Managing the complexity and scaling the server can create additional overhead too, whereas if your model runs on the edge devices then each user is bringing their own compute resources, which leads to easier scaling with an increasing number of users (also known as *horizontal scaling*)." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "> A: I've had a chance to see up close how the mobile ML landscape is changing in my work. We offer an iPhone app that depends on computer vision and for years we ran our own computer vision models in the cloud. This was the only way to do it then since those models needed significant memory and compute resources and took minutes to process. This approach required building not only the models (fun!) but also the infrastructure to ensure a certain number of \"compute worker machines\" was absolutely always running (scary), that more machines would automatically come online if traffic increased, that there was stable storage for large inputs and outputs, that the iOS app could know and tell the user how their job was doing, etc... Nowadays, Apple provides APIs for converting models to run efficiently on device and most iOS devices have dedicated ML hardware, and we run our newer models on device. So, in a few years that strategy has gone from impossible to possible but it's still not easy. In our case it's worth it, for a faster user experience and to worry less about servers. What works for you will depend, realistically, on the user experience you're trying to create and what you personally find is easy to do. If you really know how to run servers, do it. If you really know how to build native mobile apps, do that. There are many roads up the hill.\n", + "> A: I've had a chance to see up close how the mobile ML landscape is changing in my work. We offer an iPhone app that depends on computer vision, and for years we ran our own computer vision models in the cloud. 
This was the only way to do it then since those models needed significant memory and compute resources and took minutes to process inputs. This approach required building not only the models (fun!) but also the infrastructure to ensure a certain number of \"compute worker machines\" were absolutely always running (scary), that more machines would automatically come online if traffic increased, that there was stable storage for large inputs and outputs, that the iOS app could know and tell the user how their job was doing, etc. Nowadays Apple provides APIs for converting models to run efficiently on device and most iOS devices have dedicated ML hardware, so that's the strategy we use for our newer models. It's still not easy but in our case it's worth it, for a faster user experience and to worry less about servers. What works for you will depend, realistically, on the user experience you're trying to create and what you personally find is easy to do. If you really know how to run servers, do it. If you really know how to build native mobile apps, do that. There are many roads up the hill.\n", "\n", "Overall, we'd recommend using a simple CPU-based server approach where possible, for as long as you can get away with it. If you're lucky enough to have a very successful application, then you'll be able to justify the investment in more complex deployment approaches at that time.\n", "\n", @@ -1774,37 +1786,37 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## How to avoid disaster" + "## How to Avoid Disaster" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "In practice, a deep learning model will be just one piece of a much bigger system. As we discussed at the start of this chapter, a *data product* requires thinking about the entire end-to-end process within which our model lives. 
In this book, we can't hope to cover all the complexity of managing deployed data products, such as managing multiple versions of models, A/B testing, canarying, refreshing the data (should we just grow and grow our datasets all the time, or should we regularly remove some of the old data), handling data labelling, monitoring all this, detecting model rot, and so forth. However, there is an excellent book that covers many deployment issues, which is [Building Machine Learning Powered Applications](https://www.amazon.com/Building-Machine-Learning-Powered-Applications/dp/149204511X), by Emmanuel Ameisen. In this section, we will give an overview of some of the most important issues to consider.\n", + "In practice, a deep learning model will be just one piece of a much bigger system. As we discussed at the start of this chapter, a data product requires thinking about the entire end-to-end process, from conception to use in production. In this book, we can't hope to cover all the complexity of managing deployed data products, such as managing multiple versions of models, A/B testing, canarying, refreshing the data (should we just grow and grow our datasets all the time, or should we regularly remove some of the old data?), handling data labeling, monitoring all this, detecting model rot, and so forth. In this section we will give an overview of some of the most important issues to consider; for a more detailed discussion of deployment issues we refer you to the excellent [Building Machine Learning Powered Applications](http://shop.oreilly.com/product/0636920215912.do) by Emmanuel Ameisen (O'Reilly).\n", "\n", - "One of the biggest issues to consider is that understanding and testing the behavior of a deep learning model is much more difficult than most code that you would write. 
With normal software development you can analyse the exact steps that the software is taking, and carefully study which of these steps match the desired behaviour that you are trying to create. But with a neural network the behavior emerges from the model's attempt to match the training data, rather than being exactly defined.\n", + "One of the biggest issues to consider is that understanding and testing the behavior of a deep learning model is much more difficult than with most other code you write. With normal software development you can analyze the exact steps that the software is taking, and carefully study which of these steps match the desired behavior that you are trying to create. But with a neural network the behavior emerges from the model's attempt to match the training data, rather than being exactly defined.\n", "\n", - "This can result in disaster! For instance, let's say you really were rolling out a bear detection system which will be attached to video cameras around the campsite, and will warn campers of incoming bears. If we used a model trained with the dataset we downloaded, there are going to be all kinds of problems in practice, such as:\n", + "This can result in disaster! For instance, let's say we really were rolling out a bear detection system that will be attached to video cameras around campsites in national parks, and will warn campers of incoming bears. If we used a model trained with the dataset we downloaded there would be all kinds of problems in practice, such as:\n", "\n", - "- working with video data instead of images ;\n", - "- handling nighttime images, which may not appear in this dataset ;\n", - "- dealing with low resolution camera images ;\n", - "- ensuring results are returned fast enough to be useful in practice ;\n", - "- recognising bears in positions that are rarely seen in photos that people post online (for example from behind, partially covered by bushes, or when a long way away from the camera)." 
+ "- Working with video data instead of images\n", "- Handling nighttime images, which may not appear in this dataset\n", "- Dealing with low-resolution camera images\n", "- Ensuring results are returned fast enough to be useful in practice\n", "- Recognizing bears in positions that are rarely seen in photos that people post online (for example from behind, partially covered by bushes, or when a long way away from the camera)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "A big part of the issue is that the kinds of photos that people are most likely to upload to the Internet are the kinds of photos that do a good job of clearly and artistically displaying their subject matter. So we may need to do a lot of our own data collection and labelling to create a useful system.\n", + "A big part of the issue is that the kinds of photos that people are most likely to upload to the internet are the kinds of photos that do a good job of clearly and artistically displaying their subject matter--which isn't the kind of input this system is going to be getting. So, we may need to do a lot of our own data collection and labeling to create a useful system.\n", "\n", - "This is just one example of the more general problem of *out of domain* data. That is to say, there may be data that our model sees in production which is very different to what it saw during training. There isn't really a complete technical solution to this problem; instead we have to be careful about our approach to rolling out the technology.\n", + "This is just one example of the more general problem of *out-of-domain* data. That is to say, there may be data that our model sees in production which is very different to what it saw during training. There isn't really a complete technical solution to this problem; instead, we have to be careful about our approach to rolling out the technology.\n", "\n", - "There are other reasons we need to be careful too. 
One very common problem is *domain shift*; this is where the type of data that our model sees changes over time. For instance, an insurance company may use a deep learning model as part of their pricing and risk algorithm, but over time the type of customers that they attract, and the type of risks that they represent, may change so much that the original training data is no longer relevant.\n", + "There are other reasons we need to be careful too. One very common problem is *domain shift*, where the type of data that our model sees changes over time. For instance, an insurance company may use a deep learning model as part of its pricing and risk algorithm, but over time the types of customers that the company attracts, and the types of risks they represent, may change so much that the original training data is no longer relevant.\n", "\n", - "Out of domain data, and domain shift, are examples of the problem that you can never fully know the entire behaviour of your neural network. They have far too many parameters to be able to analytically understand all of their possible behaviours. This is the natural downside of the thing that they're so good at — their flexibility in being able to solve complex problems where we may not even be able to fully specify our preferred solution approaches. The good news, however, is that there are ways to mitigate these risks using a carefully thought out process. The details of this will vary depending on the details of the problem you are solving, but we will attempt to lay out here a high-level approach summarized in <> which we hope will provide useful guidance." + "Out-of-domain data and domain shift are examples of a larger problem: that you can never fully understand the entire behavior of your neural network. They have far too many parameters to be able to analytically understand all of their possible behaviors. 
This is the natural downside of their best feature—their flexibility, which enables them to solve complex problems where we may not even be able to fully specify our preferred solution approaches. The good news, however, is that there are ways to mitigate these risks using a carefully thought-out process. The details of this will vary depending on the details of the problem you are solving, but we will attempt to lay out here a high-level approach, summarized in <>, which we hope will provide useful guidance." ] }, { @@ -1818,54 +1830,54 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Where possible, the first step is to use an entirely manual process, with your deep learning model approach running in parallel, but not being used directly to drive any actions. The humans involved in the manual process should look at the deep learning outputs and check whether they make sense. For instance, with our bear classifier a park ranger could have a screen displaying any time a possible bear sighting occurred in any camera, and simply highlight them in red on the screen. The park ranger would still be expected to be just as alert as before the model was deployed; the model is simply helping to check for problems at this point.\n", + "Where possible, the first step is to use an entirely manual process, with your deep learning model approach running in parallel but not being used directly to drive any actions. The humans involved in the manual process should look at the deep learning outputs and check whether they make sense. For instance, with our bear classifier a park ranger could have a screen displaying video feeds from all the cameras, with any possible bear sightings simply highlighted in red. 
The park ranger would still be expected to be just as alert as before the model was deployed; the model is simply helping to check for problems at this point.\n", "\n", - "The second step is to try to limit the scope of the model, and have it carefully supervised by people. For instance, do a small geographically and time constrained trial of the model-driven approach. Rather than rolling your bear classifier out in every national park throughout the country, pick a single observation post, for a one-week period, and have a park ranger check each alert before it goes out.\n", + "The second step is to try to limit the scope of the model, and have it carefully supervised by people. For instance, do a small geographically and time-constrained trial of the model-driven approach. Rather than rolling our bear classifier out in every national park throughout the country, we could pick a single observation post, for a one-week period, and have a park ranger check each alert before it goes out.\n", "\n", - "Then, gradually increase the scope of your rollout. As you do so, ensure that you have really good reporting systems in place, to make sure that you are aware of any significant changes to the actions being taken compared to your manual process. For instance, if the number of bear alerts doubles or halves after rollout of the new system in some location we should be very concerned. Try to think about all the ways in which your system could go wrong, and then think about what measure or report or picture could reflect that problem, and then ensure that your regular reporting includes that information." + "Then, gradually increase the scope of your rollout. As you do so, ensure that you have really good reporting systems in place, to make sure that you are aware of any significant changes to the actions being taken compared to your manual process. 
For instance, if the number of bear alerts doubles or halves after rollout of the new system in some location, we should be very concerned. Try to think about all the ways in which your system could go wrong, and then think about what measure or report or picture could reflect that problem, and ensure that your regular reporting includes that information." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "> J: I started a company 20 years ago called _Optimal Decisions_ which used machine learning and optimisation to help giant insurance companies set their pricing, impacting tens of billions of dollars of risks. We used the approaches described above to manage the potential downsides of something that might go wrong. Also, before we worked with our clients to put anything in production, we tried to simulate the impact by testing the end-to-end system on their previous year's data. It was always quite a nerve-wracking process, putting these new algorithms in production, but every rollout was successful." + "> J: I started a company 20 years ago called _Optimal Decisions_ that used machine learning and optimization to help giant insurance companies set their pricing, impacting tens of billions of dollars of risks. We used the approaches described here to manage the potential downsides of something going wrong. Also, before we worked with our clients to put anything in production, we tried to simulate the impact by testing the end-to-end system on their previous year's data. It was always quite a nerve-wracking process, putting these new algorithms into production, but every rollout was successful." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Unforeseen consequences and feedback loops" + "### Unforeseen Consequences and Feedback Loops" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "One of the biggest challenges in rolling out a model is that your model may change the behaviour of the system it is a part of. 
For instance, consider a \"predictive policing\" algorithm that predicts more crime in certain neighborhoods, causing more police officers to be sent to those neighborhoods, which can result in more crime being recorded in those neighborhoods, and so on. In the Royal Statistical Society paper [To predict and serve](https://rss.onlinelibrary.wiley.com/doi/full/10.1111/j.1740-9713.2016.00960.x), Kristian Lum and William Isaac write: \"predictive policing is aptly named: it is predicting future policing, not future crime\".\n", + "One of the biggest challenges in rolling out a model is that your model may change the behaviour of the system it is a part of. For instance, consider a \"predictive policing\" algorithm that predicts more crime in certain neighborhoods, causing more police officers to be sent to those neighborhoods, which can result in more crimes being recorded in those neighborhoods, and so on. In the Royal Statistical Society paper [\"To Predict and Serve?\"](https://rss.onlinelibrary.wiley.com/doi/full/10.1111/j.1740-9713.2016.00960.x), Kristian Lum and William Isaac observe that: \"predictive policing is aptly named: it is predicting future policing, not future crime.\"\n", "\n", - "Part of the issue in this case is that in the presence of *bias* (which we'll discuss in depth in the next chapter), feedback loops can result in negative implications of that bias getting worse and worse. For instance, there are concerns that this is already happening in the US, where there is significant bias in arrest rates on racial grounds. [According to the ACLU](https://www.aclu.org/issues/smart-justice/sentencing-reform/war-marijuana-black-and-white), \"despite roughly equal usage rates, Blacks are 3.73 times more likely than whites to be arrested for marijuana\". 
The impact of this bias, along with the roll-out of predictive policing algorithms in many parts of the US, led Bärí Williams to [write in the NY Times](https://www.nytimes.com/2017/12/02/opinion/sunday/intelligent-policing-and-my-innocent-children.html): \"The same technology that’s the source of so much excitement in my career is being used in law enforcement in ways that could mean that in the coming years, my son, who is 7 now, is more likely to be profiled or arrested — or worse — for no reason other than his race and where we live.\"\n", + "Part of the issue in this case is that in the presence of bias (which we'll discuss in depth in the next chapter), *feedback loops* can result in negative implications of that bias getting worse and worse. For instance, there are concerns that this is already happening in the US, where there is significant bias in arrest rates on racial grounds. [According to the ACLU](https://www.aclu.org/issues/smart-justice/sentencing-reform/war-marijuana-black-and-white), \"despite roughly equal usage rates, Blacks are 3.73 times more likely than whites to be arrested for marijuana.\" The impact of this bias, along with the rollout of predictive policing algorithms in many parts of the US, led Bärí Williams to [write in the *New York Times*](https://www.nytimes.com/2017/12/02/opinion/sunday/intelligent-policing-and-my-innocent-children.html): \"The same technology that’s the source of so much excitement in my career is being used in law enforcement in ways that could mean that in the coming years, my son, who is 7 now, is more likely to be profiled or arrested—or worse—for no reason other than his race and where we live.\"\n", "\n", - "A helpful exercise prior to rolling out a significant machine learning system is to consider this question: \"what would happen if it went really, really well?\" In other words, what if the predictive power was extremely high, and its ability to influence behaviour was extremely significant? 
In that case, who would be most impacted? What would the most extreme results potentially look like? How would you know what was really going on?\n", + "A helpful exercise prior to rolling out a significant machine learning system is to consider this question: \"What would happen if it went really, really well?\" In other words, what if the predictive power was extremely high, and its ability to influence behavior was extremely significant? In that case, who would be most impacted? What would the most extreme results potentially look like? How would you know what was really going on?\n", "\n", - "Such a thought exercise might help you to construct a more careful rollout plan, ongoing monitoring systems, and human oversight. Of course, human oversight isn't useful if it isn't listened to; so make sure that there are reliable and resilient communication channels so that the right people will be aware of issues, and will have the power to fix them." + "Such a thought exercise might help you to construct a more careful rollout plan, with ongoing monitoring systems and human oversight. Of course, human oversight isn't useful if it isn't listened to, so make sure that there are reliable and resilient communication channels so that the right people will be aware of issues, and will have the power to fix them." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Get writing!" + "## Get Writing!" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "One of the things our students have found most helpful to solidify their understanding of this material is to write it down. There is no better test of your understanding of a topic than attempting to teach it to somebody else. This is helpful even if you never show your writing to anybody — but it's even better if you share it! So we recommend that, if you haven't already, you start a blog. 
Now that you've finished chapter 2, and have learned how to train and deploy models, you're well placed to write your first blog post about your deep learning journey. What's surprised you? What opportunities do you see for deep learning in your field? What obstacles do you see?\n", + "One of the things our students have found most helpful to solidify their understanding of this material is to write it down. There is no better test of your understanding of a topic than attempting to teach it to somebody else. This is helpful even if you never show your writing to anybody—but it's even better if you share it! So we recommend that, if you haven't already, you start a blog. Now that you've completed Chapter 2 and have learned how to train and deploy models, you're well placed to write your first blog post about your deep learning journey. What's surprised you? What opportunities do you see for deep learning in your field? What obstacles do you see?\n", "\n", - "Rachel Thomas, co-founder of fast.ai, wrote in the article [Why you (yes, you) should blog](https://medium.com/@racheltho/why-you-yes-you-should-blog-7d2544ac1045):\n", + "Rachel Thomas, cofounder of fast.ai, wrote in the article [\"Why You (Yes, You) Should Blog\"](https://medium.com/@racheltho/why-you-yes-you-should-blog-7d2544ac1045):\n", "\n", "```asciidoc\n", "____\n", @@ -1879,9 +1891,11 @@ "____\n", "```\n", "\n", - "Perhaps her most important tip is this: \"*You are best positioned to help people one step behind you. The material is still fresh in your mind. Many experts have forgotten what it was like to be a beginner (or an intermediate) and have forgotten why the topic is hard to understand when you first hear it. 
The context of your particular background, your particular style, and your knowledge level will give a different twist to what you’re writing about*.\"\n", + "Perhaps her most important tip is this: \n", "\n", - "We've provided full details on how to set up a blog in an appendix \"_Creating a blog_\". If you don't have a blog already, jump over to that chapter now, because we've got a really great approach set up for you to start blogging, for free, with no ads--and you can even use Jupyter Notebook!" + "> : You are best positioned to help people one step behind you. The material is still fresh in your mind. Many experts have forgotten what it was like to be a beginner (or an intermediate) and have forgotten why the topic is hard to understand when you first hear it. The context of your particular background, your particular style, and your knowledge level will give a different twist to what you’re writing about.\n", + "\n", + "We've provided full details on how to set up a blog in <>. If you don't have a blog already, take a look at that now, because we've got a really great approach set up for you to start blogging for free, with no ads--and you can even use Jupyter Notebook!" ] }, { @@ -1895,21 +1909,21 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "1. Provide an example of where the bear classification model might work poorly, due to structural or style differences to the training data.\n", + "1. Provide an example of where the bear classification model might work poorly in production, due to structural or style differences in the training data.\n", "1. Where do text models currently have a major deficiency?\n", "1. What are possible negative societal implications of text generation models?\n", "1. In situations where a model might make mistakes, and those mistakes could be harmful, what is a good alternative to automating a process?\n", "1. What kind of tabular data is deep learning particularly good at?\n", "1. 
What's a key downside of directly using a deep learning model for recommendation systems?\n", - "1. What are the steps of the Drivetrain approach?\n", - "1. How do the steps of the Drivetrain approach map to a recommendation system?\n", + "1. What are the steps of the Drivetrain Approach?\n", + "1. How do the steps of the Drivetrain Approach map to a recommendation system?\n", "1. Create an image recognition model using data you curate, and deploy it on the web.\n", "1. What is `DataLoaders`?\n", "1. What four things do we need to tell fastai to create `DataLoaders`?\n", "1. What does the `splitter` parameter to `DataBlock` do?\n", "1. How do we ensure a random split always gives the same validation set?\n", "1. What letters are often used to signify the independent and dependent variables?\n", - "1. What's the difference between crop, pad, and squish resize approaches? When might you choose one over the other?\n", + "1. What's the difference between the crop, pad, and squish resize approaches? When might you choose one over the others?\n", "1. What is data augmentation? Why is it needed?\n", "1. What is the difference between `item_tfms` and `batch_tfms`?\n", "1. What is a confusion matrix?\n", @@ -1918,27 +1932,27 @@ "1. What are IPython widgets?\n", "1. When might you want to use CPU for deployment? When might GPU be better?\n", "1. What are the downsides of deploying your app to a server, instead of to a client (or edge) device such as a phone or PC?\n", - "1. What are 3 examples of problems that could occur when rolling out a bear warning system in practice?\n", - "1. What is \"out of domain data\"?\n", + "1. What are three examples of problems that could occur when rolling out a bear warning system in practice?\n", + "1. What is \"out-of-domain data\"?\n", "1. What is \"domain shift\"?\n", - "1. What are the 3 steps in the deployment process?\n", - "1. 
For a project you're interested in applying deep learning to, consider the thought experiment \"what would happen if it went really, really well?\"\n", - "1. Start a blog, and write your first blog post. For instance, write about what you think deep learning might be useful for in a domain you're interested in." + "1. What are the three steps in the deployment process?" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research" + "### Further Research" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "1. Consider how the Drivetrain approach maps to a project or problem you're interested in.\n", - "1. When might it be best to avoid certain types of data augmentation?" + "1. Consider how the Drivetrain Approach maps to a project or problem you're interested in.\n", + "1. When might it be best to avoid certain types of data augmentation?\n", + "1. For a project you're interested in applying deep learning to, consider the thought experiment \"What would happen if it went really, really well?\"\n", + "1. Start a blog, and write your first blog post. For instance, write about what you think deep learning might be useful for in a domain you're interested in." ] }, { diff --git a/03_ethics.ipynb b/03_ethics.ipynb index 5479ac5ef..509dcf6df 100644 --- a/03_ethics.ipynb +++ b/03_ethics.ipynb @@ -18,14 +18,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Sidebar: Acknowledgement: Dr Rachel Thomas" + "### Sidebar: Acknowledgement: Dr. Rachel Thomas" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "This chapter was co-authored by Dr Rachel Thomas, the co-founder of fast.ai, and founding director of the Center for Applied Data Ethics at the University of San Francisco. It largely follows a subset of the syllabus she developed for the [Introduction to Data Ethics](https://ethics.fast.ai) course." + "This chapter was co-authored by Dr. 
Rachel Thomas, the cofounder of fast.ai, and founding director of the Center for Applied Data Ethics at the University of San Francisco. It largely follows a subset of the syllabus she developed for the [Introduction to Data Ethics](https://ethics.fast.ai) course." ] }, { @@ -39,9 +39,9 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "As we discussed in Chapters 1 and 2, sometimes, machine learning models can go wrong. They can have bugs. They can be presented with data that they haven't seen before, and behave in ways we don't expect. Or, they could work exactly as designed, but be used for something that you would much prefer they were never ever used for.\n", + "As we discussed in Chapters 1 and 2, sometimes machine learning models can go wrong. They can have bugs. They can be presented with data that they haven't seen before, and behave in ways we don't expect. Or they could work exactly as designed, but be used for something that we would much prefer they were never, ever used for.\n", "\n", - "Because deep learning is such a powerful tool and can be used for so many things, it becomes particularly important that we consider the consequences of our choices. The philosophical study of *ethics* is the study of right and wrong, including how we can define those terms, recognise right and wrong actions, and understand the connection between actions and consequences. The field of *data ethics* has been around for a long time, and there are many academics focused on this field. It is being used to help define policy in many jurisdictions; it is being used in companies big and small to consider how best to ensure good societal outcomes from product development; and it is being used by researchers who want to make sure that the work they are doing is used for good, and not for bad.\n", + "Because deep learning is such a powerful tool and can be used for so many things, it becomes particularly important that we consider the consequences of our choices. 
The philosophical study of *ethics* is the study of right and wrong, including how we can define those terms, recognize right and wrong actions, and understand the connection between actions and consequences. The field of *data ethics* has been around for a long time, and there are many academics focused on this field. It is being used to help define policy in many jurisdictions; it is being used in companies big and small to consider how best to ensure good societal outcomes from product development; and it is being used by researchers who want to make sure that the work they are doing is used for good, and not for bad.\n", "\n", "As a deep learning practitioner, therefore, it is likely that at some point you are going to be put in a situation where you need to consider data ethics. So what is data ethics? It's a subfield of ethics, so let's start there." ] @@ -50,30 +50,30 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "> J: At university, philosophy of ethics was my main thing (it would have been the topic of my thesis, if I'd finished it, instead of dropping out to join the real-world). Based on the years I spent studying ethics, I can tell you this: no one really agrees on what right and wrong are, whether they exist, how to spot them, which people are good, and which bad, or pretty much anything else. So don't expect too much from the theory! We're going to focus on examples and thought starters here, not theory." + "> J: At university, philosophy of ethics was my main thing (it would have been the topic of my thesis, if I'd finished it, instead of dropping out to join the real world). Based on the years I spent studying ethics, I can tell you this: no one really agrees on what right and wrong are, whether they exist, how to spot them, which people are good, and which bad, or pretty much anything else. So don't expect too much from the theory! We're going to focus on examples and thought starters here, not theory." 
] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "In answering the question [What is Ethics](https://www.scu.edu/ethics/ethics-resources/ethical-decision-making/what-is-ethics/), The Markkula Center for Applied Ethics says that *ethics* refers to:\n", + "In answering the question [\"What Is Ethics\"](https://www.scu.edu/ethics/ethics-resources/ethical-decision-making/what-is-ethics/), The Markkula Center for Applied Ethics says that the term refers to:\n", "\n", - "- Well-founded standards of right and wrong that prescribe what humans ought to do, and\n", + "- Well-founded standards of right and wrong that prescribe what humans ought to do\n", "- The study and development of one's ethical standards.\n", "\n", - "There is no list of right answers for ethics. There is no list of do's and dont's. Ethics is complicated, and context-dependent. It involves the perspectives of many stakeholders. Ethics is a muscle that you have to develop and practice. In this chapter, our goal is to provide some signposts to help you on that journey.\n", + "There is no list of right answers. There is no list of dos and don'ts. Ethics is complicated, and context-dependent. It involves the perspectives of many stakeholders. Ethics is a muscle that you have to develop and practice. In this chapter, our goal is to provide some signposts to help you on that journey.\n", "\n", - "Spotting ethical issues is best to do as part of a collaborative team. This is the only way you can really incorporate different perspectives. Different people's backgrounds will help them to see things which may not be obvious to you. Working with a team is helpful for many \"muscle building\" activities, including this one.\n", + "Spotting ethical issues is best done as part of a collaborative team. This is the only way you can really incorporate different perspectives. Different people's backgrounds will help them to see things which may not be obvious to you. 
Working with a team is helpful for many \"muscle-building\" activities, including this one.\n", "\n", - "This chapter is certainly not the only part of the book where we talk about data ethics, but it's good to have a place where we focus on it for a while. To get oriented, it's perhaps easiest to look at a few examples. So we picked out three that we think illustrate effectively some of the key topics." + "This chapter is certainly not the only part of the book where we talk about data ethics, but it's good to have a place where we focus on it for a while. To get oriented, it's perhaps easiest to look at a few examples. So, we picked out three that we think illustrate effectively some of the key topics." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Key examples for data ethics" + "## Key Examples for Data Ethics" ] }, { @@ -82,32 +82,32 @@ "source": [ "We are going to start with three specific examples that illustrate three common ethical issues in tech:\n", "\n", - "1. **Recourse processes**: Arkansas's buggy healthcare algorithms left patients stranded\n", - "2. **Feedback loops**: YouTube's recommendation system helped unleash a conspiracy theory boom\n", - "3. **Bias**: When a traditionally African-American name is searched for on Google, it displays ads for criminal background checks\n", + "1. *Recourse processes*--Arkansas's buggy healthcare algorithms left patients stranded.\n", + "2. *Feedback loops*--YouTube's recommendation system helped unleash a conspiracy theory boom.\n", + "3. *Bias*--When a traditionally African-American name is searched for on Google, it displays ads for criminal background checks.\n", "\n", - "In fact, for every concept that we introduce in this chapter, we are going to provide at least one specific example. For each one, have a think about what you could have done in this situation, and think about what kinds of obstructions there might have been to you getting that done. How would you deal with them? 
What would you look out for?" + "In fact, for every concept that we introduce in this chapter, we are going to provide at least one specific example. For each one, think about what you could have done in this situation, and what kinds of obstructions there might have been to you getting that done. How would you deal with them? What would you look out for?" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Bugs and recourse: Buggy algorithm used for healthcare benefits" + "### Bugs and Recourse: Buggy Algorithm Used for Healthcare Benefits" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "The Verge investigated software used in over half of the U.S. states to determine how much healthcare people receive, and documented their findings in an article [What Happens When an Algorithm Cuts Your Healthcare](https://www.theverge.com/2018/3/21/17144260/healthcare-medicaid-algorithm-arkansas-cerebral-palsy). After implementation of the algorithm in Arkansas, people (many with severe disabilities) drastically had their healthcare cut. For instance, Tammy Dobbs, a woman with cerebral palsy who needs an aid to help her to get out of bed, to go to the bathroom, to get food, and more, had her hours of help suddenly reduced by 20 hours a week. She couldn’t get any explanation for why her healthcare was cut. Eventually, a court case revealed that there were mistakes in the software implementation of the algorithm, negatively impacting people with diabetes or cerebral palsy. However, Dobbs and many other people reliant on these health care benefits live in fear that their benefits could again be cut suddenly and inexplicably." + "The Verge investigated software used in over half of the US states to determine how much healthcare people receive, and documented their findings in the article [\"What Happens When an Algorithm Cuts Your Healthcare\"](https://www.theverge.com/2018/3/21/17144260/healthcare-medicaid-algorithm-arkansas-cerebral-palsy). 
After implementation of the algorithm in Arkansas, hundreds of people (many with severe disabilities) had their healthcare drastically cut. For instance, Tammy Dobbs, a woman with cerebral palsy who needs an aide to help her to get out of bed, to go to the bathroom, to get food, and more, had her hours of help suddenly reduced by 20 hours a week. She couldn’t get any explanation for why her healthcare was cut. Eventually, a court case revealed that there were mistakes in the software implementation of the algorithm, negatively impacting people with diabetes or cerebral palsy. However, Dobbs and many other people reliant on these healthcare benefits live in fear that their benefits could again be cut suddenly and inexplicably." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Feedback loops: YouTube's recommendation system" + "### Feedback Loops: YouTube's Recommendation System" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Feedback loops can occur when your model is controlling the next round of data you get. The data that is returned quickly becomes flawed by the software itself.\n", "\n", - "For instance, in <> we briefly mentioned the reinforcement learning algorithm which Google introduced for YouTube's recommendation system. YouTube has 1.9bn users, who watch over 1 billion hours of YouTube videos a day. Their algorithm, which was designed to optimise watch time, is responsible for around 70% of the content that is watched. It led to out-of-control feedback loops, leading the New York Times to run the headline \"YouTube Unleashed a Conspiracy Theory Boom. Can It Be Contained?\". Ostensibly recommendation systems are predicting what content people will like, but they also have a lot of power in determining what content people even see." + "For instance, YouTube has 1.9 billion users, who watch over 1 billion hours of YouTube videos a day. 
Its recommendation algorithm (built by Google), which was designed to optimize watch time, is responsible for around 70% of the content that is watched. But there was a problem: it led to out-of-control feedback loops, leading the *New York Times* to run the headline [\"YouTube Unleashed a Conspiracy Theory Boom. Can It Be Contained?\"](https://www.nytimes.com/2019/02/19/technology/youtube-conspiracy-stars.html). Ostensibly recommendation systems are predicting what content people will like, but they also have a lot of power in determining what content people even see." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Bias: Professor Lantanya Sweeney \"arrested\"" + "### Bias: Professor Latanya Sweeney \"Arrested\"" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Dr. Latanya Sweeney is a professor at Harvard and director of their data privacy lab. In the paper [Discrimination in Online Ad Delivery](https://arxiv.org/abs/1301.6822) (see <>) she describes her discovery that googling her name resulted in advertisements saying \"Latanya Sweeney arrested\" even though she is the only Latanya Sweeney and has never been arrested. However when she googled other names, such as Kirsten Lindquist, she got more neutral ads, even though Kirsten Lindquist has been arrested three times." + "Dr. Latanya Sweeney is a professor at Harvard and director of the university's data privacy lab. In the paper [\"Discrimination in Online Ad Delivery\"](https://arxiv.org/abs/1301.6822) (see <>) she describes her discovery that Googling her name resulted in advertisements saying \"Latanya Sweeney, arrested?\" even though she is the only known Latanya Sweeney and has never been arrested. However, when she Googled other names, such as \"Kirsten Lindquist,\" she got more neutral ads, even though Kirsten Lindquist has been arrested three times." 
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-    "\"Screenshot"
+    "\"Screenshot"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-    "Being a computer scientist, she studied this systematically, and looked at over 2000 names. She found that this pattern held where historically black names received advertisements suggesting that the person had a criminal record. Whereas, white names had more neutral advertisements.\n",
+    "Being a computer scientist, she studied this systematically, and looked at over 2000 names. She found a clear pattern where historically Black names received advertisements suggesting that the person had a criminal record, whereas white names had more neutral advertisements.\n",
     "\n",
-    "This is an example of bias. It can make a big difference to people's lives — for instance, if a job applicant is googled then it may appear that they have a criminal record when they do not."
+    "This is an example of bias. It can make a big difference to people's lives—for instance, if a job applicant is Googled it may appear that they have a criminal record when they do not."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-    "### Why does this matter?"
+    "### Why Does This Matter?"
   ]
  },
  {
@@ -162,11 +162,11 @@
    "source": [
    "One very natural reaction to considering these issues is: \"So what? What's that got to do with me? I'm a data scientist, not a politician. I'm not one of the senior executives at my company who make the decisions about what we do. I'm just trying to build the most predictive model I can.\"\n",
    "\n",
-    "These are very reasonable questions. But we're going to try to convince you that the answer is: everybody who is training models absolutely needs to consider how their model will be used. And to consider how to best ensure that it is used as positively as possible. There are things you can do. And if you don't do these things, then things can go pretty badly.\n",
+    "These are very reasonable questions. 
But we're going to try to convince you that the answer is that everybody who is training models absolutely needs to consider how their models will be used, and consider how to best ensure that they are used as positively as possible. There are things you can do. And if you don't do them, then things can go pretty badly.\n", "\n", - "One particularly hideous example of what happens when technologists focus on technology at all costs is the story of IBM and Nazi Germany. A Swiss judge ruled \"It does not thus seem unreasonable to deduce that IBM's technical assistance facilitated the tasks of the Nazis in the commission of their crimes against humanity, acts also involving accountancy and classification by IBM machines and utilized in the concentration camps themselves.\"\n", + "One particularly hideous example of what happens when technologists focus on technology at all costs is the story of IBM and Nazi Germany. In 2001, a Swiss judge ruled that it was not unreasonable \"to deduce that IBM's technical assistance facilitated the tasks of the Nazis in the commission of their crimes against humanity, acts also involving accountancy and classification by IBM machines and utilized in the concentration camps themselves.\"\n", "\n", - "IBM, you see, supplied the Nazis with data tabulation products necessary to track the extermination of Jews and other groups on a massive scale. This was driven from the top of the company, with marketing to Hitler and his leadership team. Company President Thomas Watson personally approved the 1939 release of special IBM alphabetizing machines to help organize the deportation of Polish Jews. Pictured here is Adolf Hitler (far left) meeting with IBM CEO Tom Watson Sr. (2nd from left), shortly before Hitler awarded Watson a special “Service to the Reich” medal in 1937:" + "IBM, you see, supplied the Nazis with data tabulation products necessary to track the extermination of Jews and other groups on a massive scale. 
This was driven from the top of the company, with marketing to Hitler and his leadership team. Company President Thomas Watson personally approved the 1939 release of special IBM alphabetizing machines to help organize the deportation of Polish Jews. Pictured in <> is Adolf Hitler (far left) meeting with IBM CEO Tom Watson Sr. (second from left), shortly before Hitler awarded Watson a special “Service to the Reich” medal in 1937." ] }, { @@ -180,7 +180,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "But it also happened throughout the organization. IBM and its subsidiaries provided regular training and maintenance on-site at the concentration camps: printing off cards, configuring machines, and repairing them as they broke frequently. IBM set up categorizations on their punch card system for the way that each person was killed, which group they were assigned to, and the logistical information necessary to track them through the vast Holocaust system. IBM's code for Jews in the concentration camps was 8, where around 6,000,000 were killed. Its code for Romanis was 12 (they were labeled by the Nazis as \"asocials\", with over 300,000 killed in the *Zigeunerlager*, or “Gypsy camp”). General executions were coded as 4, death in the gas chambers as 6." + "But this was not an isolated incident--the organization's involvement was extensive. IBM and its subsidiaries provided regular training and maintenance onsite at the concentration camps: printing off cards, configuring machines, and repairing them as they broke frequently. IBM set up categorizations on its punch card system for the way that each person was killed, which group they were assigned to, and the logistical information necessary to track them through the vast Holocaust system. IBM's code for Jews in the concentration camps was 8: some 6,000,000 were killed. 
Its code for Romanis was 12 (they were labeled by the Nazis as \"asocials,\" with over 300,000 killed in the *Zigeunerlager*, or “Gypsy camp”). General executions were coded as 4, death in the gas chambers as 6." ] }, { @@ -194,26 +194,26 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Of course, the project managers and engineers and technicians involved were just living their ordinary lives. Caring for their families, going to the church on Sunday, doing their jobs the best they could. Following orders. The marketers were just doing what they could to meet their business development goals. Edwin Black, author of \"IBM and the Holocaust\", said: \"To the blind technocrat, the means were more important than the ends. The destruction of the Jewish people became even less important because the invigorating nature of IBM's technical achievement was only heightened by the fantastical profits to be made at a time when bread lines stretched across the world.\"\n", + "Of course, the project managers and engineers and technicians involved were just living their ordinary lives. Caring for their families, going to the church on Sunday, doing their jobs the best they could. Following orders. The marketers were just doing what they could to meet their business development goals. As Edwin Black, author of *IBM and the Holocaust* (Dialog Press) observed: \"To the blind technocrat, the means were more important than the ends. The destruction of the Jewish people became even less important because the invigorating nature of IBM's technical achievement was only heightened by the fantastical profits to be made at a time when bread lines stretched across the world.\"\n", "\n", - "Step back for a moment and consider: how would you feel if you discovered that you had been part of a system that ended up hurting society? Would you even know? Would you be open to finding out? How can you help make sure this doesn't happen? 
We have described the most extreme situation here in Nazi Germany, but there are many negative societal consequences happening due to AI and machine learning right now, some of which we'll describe in this chapter.\n", + "Step back for a moment and consider: How would you feel if you discovered that you had been part of a system that ended up hurting society? Would you be open to finding out? How can you help make sure this doesn't happen? We have described the most extreme situation here, but there are many negative societal consequences linked to AI and machine learning being observed today, some of which we'll describe in this chapter.\n", "\n", - "It's not just a moral burden either. Sometimes, technologists pay very directly for their actions. For instance, the first person who was jailed as a result of the Volkswagen scandal, where the car company cheated on their diesel emissions tests, was not the manager that oversaw the project, or an executive at the helm of the company. It was one of the engineers, James Liang, who just did what he was told.\n", + "It's not just a moral burden, either. Sometimes technologists pay very directly for their actions. For instance, the first person who was jailed as a result of the Volkswagen scandal, where the car company was revealed to have cheated on its diesel emissions tests, was not the manager that oversaw the project, or an executive at the helm of the company. It was one of the engineers, James Liang, who just did what he was told.\n", "\n", - "On the other hand, if a project you are involved in turns out to make a huge positive impact on even one person, this is going to make you feel pretty great!\n", + "Of course, it's not all bad--if a project you are involved in turns out to make a huge positive impact on even one person, this is going to make you feel pretty great!\n", "\n", - "Okay, so hopefully we have convinced you that you ought to care. But what should you do? 
As data scientists, we're naturally inclined to focus on making our model better at optimizing some metric. But optimizing that metric may not actually lead to better outcomes. And even if optimizing that metric *does* help create better outcomes, it almost certainly won't be the only thing that matters. Consider the pipeline of steps that occurs between the development of a model or an algorithm by a researcher or practitioner, and the point at which this work is actually used to make some decision. This entire pipeline needs to be considered *as a whole* if we're to have a hope of getting the kinds of outcomes we want.\n", + "Okay, so hopefully we have convinced you that you ought to care. But what should you do? As data scientists, we're naturally inclined to focus on making our models better by optimizing some metric or other. But optimizing that metric may not actually lead to better outcomes. And even if it *does* help create better outcomes, it almost certainly won't be the only thing that matters. Consider the pipeline of steps that occurs between the development of a model or an algorithm by a researcher or practitioner, and the point at which this work is actually used to make some decision. This entire pipeline needs to be considered *as a whole* if we're to have a hope of getting the kinds of outcomes we want.\n", "\n", - "Normally there is a very long chain from one end to the other. This is especially true if you are a researcher where you don't even know if your research will ever get used for anything, or if you're involved in data collection, which is even earlier in the pipeline. But no-one is better placed to inform everyone involved in this chain about the capabilities, constraints, and details of your work than you are. 
Although there's no \"silver bullet\" that can ensure your work is used the right way, by getting involved in the process, and asking the right questions, you can at the very least ensure that the right issues are being considered.\n", + "Normally there is a very long chain from one end to the other. This is especially true if you are a researcher, where you might not even know if your research will ever get used for anything, or if you're involved in data collection, which is even earlier in the pipeline. But no one is better placed to inform everyone involved in this chain about the capabilities, constraints, and details of your work than you are. Although there's no \"silver bullet\" that can ensure your work is used the right way, by getting involved in the process, and asking the right questions, you can at the very least ensure that the right issues are being considered.\n", "\n", - "Sometimes, the right response to being asked to do a piece of work is to just say \"no\". Often, however, the response we hear is \"if I don’t do it, someone else will\". But consider this: if you’ve been picked for the job, you’re the best person they’ve found; so if you don’t do it, the best person isn’t working on that project. If the first 5 they ask all say no too, then even better!" + "Sometimes, the right response to being asked to do a piece of work is to just say \"no.\" Often, however, the response we hear is, \"If I don’t do it, someone else will.\" But consider this: if you’ve been picked for the job, you’re the best person they’ve found to do it--so if you don’t do it, the best person isn’t working on that project. If the first five people they ask all say no too, even better!" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Integrating machine learning with product design" + "## Integrating Machine Learning with Product Design" ] }, { @@ -224,25 +224,25 @@ "\n", "These are not just algorithm questions. They are data product design questions. 
But the product managers, executives, judges, journalists, doctors… whoever ends up developing and using the system of which your model is a part will not be well-placed to understand the decisions that you made, let alone change them.\n", "\n", - "For instance, two studies found that Amazon’s facial recognition software produced [inaccurate](https://www.nytimes.com/2018/07/26/technology/amazon-aclu-facial-recognition-congress.html) and [racially biased results](https://www.theverge.com/2019/1/25/18197137/amazon-rekognition-facial-recognition-bias-race-gender). Amazon claimed that the researchers should have changed the default parameters; they did not explain how it would change the racially biased results. Furthermore, it turned out that [Amazon was not instructing police departments](https://gizmodo.com/defense-of-amazons-face-recognition-tool-undermined-by-1832238149) that used its software to do this either. There was, presumably, a big distance between the researchers that developed these algorithms, and the Amazon documentation staff that wrote the guidelines provided to the police. A lack of tight integration led to serious problems for society, the police, and Amazon themselves. It turned out that their system erroneously *matched* 28 members of congress to criminal mugshots! (And these members of congress wrongly matched to criminal mugshots disproportionately included people of color as seen in <>.)" + "For instance, two studies found that Amazon’s facial recognition software produced [inaccurate](https://www.nytimes.com/2018/07/26/technology/amazon-aclu-facial-recognition-congress.html) and [racially biased](https://www.theverge.com/2019/1/25/18197137/amazon-rekognition-facial-recognition-bias-race-gender) results. Amazon claimed that the researchers should have changed the default parameters, without explaining how this would have changed the biased results. 
Furthermore, it turned out that [Amazon was not instructing police departments](https://gizmodo.com/defense-of-amazons-face-recognition-tool-undermined-by-1832238149) that used its software to do this either. There was, presumably, a big distance between the researchers that developed these algorithms and the Amazon documentation staff that wrote the guidelines provided to the police. A lack of tight integration led to serious problems for society at large, the police, and Amazon themselves. It turned out that their system erroneously matched 28 members of Congress to criminal mugshots! (And the Congresspeople wrongly matched to criminal mugshots were disproportionately people of color, as seen in <>.)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-    "\"Picture"
+    "\"Picture"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-    "Data scientists need to be part of a cross-disciplinary team. And researchers need to work closely with the kinds of people who will end up using their research. Better still is if the domain experts themselves have learnt enough to be able to train and debug some models themselves — hopefully there are a few of you reading this book right now!\n",
+    "Data scientists need to be part of a cross-disciplinary team. And researchers need to work closely with the kinds of people who will end up using their research. Better still is if the domain experts themselves have learned enough to be able to train and debug some models themselves—hopefully there are a few of you reading this book right now!\n",
     "\n",
-    "The modern workplace is a very specialised place. Everybody tends to have very well-defined jobs to perform. Especially in large companies, it can be very hard to know what all the pieces of the puzzle are. Sometimes companies even intentionally obscure the overall project goals that are being worked on, if they know that their employees are not going to like the answers. 
This is sometimes done by compartmentalising pieces as much as possible.\n",
+    "The modern workplace is a very specialized place. Everybody tends to have well-defined jobs to perform. Especially in large companies, it can be hard to know what all the pieces of the puzzle are. Sometimes companies even intentionally obscure the overall project goals that are being worked on, if they know that their employees are not going to like the answers. This is sometimes done by compartmentalizing pieces as much as possible.\n",
     "\n",
-    "In other words, we're not saying that any of this is easy. It's hard. It's really hard. We all have to do our best. And we have often seen that the people who do get involved in the higher-level context of these projects, and attempt to develop cross-disciplinary capabilities and teams, become some of the most important and well rewarded members of their organisations. It's the kind of work that tends to be highly appreciated by senior executives, even if it is sometimes considered rather uncomfortable by middle management."
+    "In other words, we're not saying that any of this is easy. It's hard. It's really hard. We all have to do our best. And we have often seen that the people who do get involved in the higher-level context of these projects, and attempt to develop cross-disciplinary capabilities and teams, become some of the most important and well rewarded members of their organizations. It's the kind of work that tends to be highly appreciated by senior executives, even if it is sometimes considered rather uncomfortable by middle management."
   ]
  },
  {
@@ -256,12 +256,12 @@
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-    "Data ethics is a big field, and we can't cover everything. Instead, we're going to pick a few topics which we think are particularly relevant:\n",
+    "Data ethics is a big field, and we can't cover everything. 
Instead, we're going to pick a few topics that we think are particularly relevant:\n", "\n", - "- need for recourse and accountability\n", - "- feedback loops\n", - "- bias\n", - "- disinformation" + "- The need for recourse and accountability\n", + "- Feedback loops\n", + "- Bias\n", + "- Disinformation" ] }, { @@ -275,16 +275,16 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Recourse and accountability" + "### Recourse and Accountability" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "In a complex system, it is easy for no one person to feel responsible for outcomes. While this is understandable, it does not lead to good results. In the earlier example of the Arkansas healthcare system in which a bug led to people with cerebral palsy losing access to needed care, the creator of the algorithm blamed government officials, and government officials could blame those who implemented the software. NYU professor Danah Boyd described this phenomenon: \"bureaucracy has often been used to evade responsibility, and today's algorithmic systems are extending bureaucracy.\"\n", + "In a complex system, it is easy for no one person to feel responsible for outcomes. While this is understandable, it does not lead to good results. In the earlier example of the Arkansas healthcare system in which a bug led to people with cerebral palsy losing access to needed care, the creator of the algorithm blamed government officials, and government officials blamed those who implemented the software. NYU professor [Danah Boyd](https://www.youtube.com/watch?v=NTl0yyPqf3E) described this phenomenon: \"Bureaucracy has often been used to shift or evade responsibility... Today's algorithmic systems are extending bureaucracy.\"\n", "\n", - "An additional reason why recourse is so necessary is because data often contains errors. Mechanisms for audits and error-correction are crucial. 
A database of suspected gang members maintained by California law enforcement officials was found to be full of errors, including 42 babies who had been added to the database when they were less than 1 year old (28 of whom were marked as “admitting to being gang members”). In this case, there was no process in place for correcting mistakes or removing people once they’d been added. Another example is the US credit report system: in a large-scale study of credit reports by the FTC (Federal Trade Commission) in 2012, it was found that 26% of consumers had at least one mistake in their files, and 5% had errors that could be devastating. Yet, the process of getting such errors corrected is incredibly slow and opaque. When public-radio reporter Bobby Allyn discovered that he was erroneously listed as having a firearms conviction, it took him \"more than a dozen phone calls, the handiwork of a county court clerk and six weeks to solve the problem. And that was only after I contacted the company’s communications department as a journalist.\" (as covered in the article [How the careless errors of credit reporting agencies are ruining people’s lives](https://www.washingtonpost.com/posteverything/wp/2016/09/08/how-the-careless-errors-of-credit-reporting-agencies-are-ruining-peoples-lives/))\n", + "An additional reason why recourse is so necessary is because data often contains errors. Mechanisms for audits and error correction are crucial. A database of suspected gang members maintained by California law enforcement officials was found to be full of errors, including 42 babies who had been added to the database when they were less than 1 year old (28 of whom were marked as “admitting to being gang members”). In this case, there was no process in place for correcting mistakes or removing people once they’d been added. 
Another example is the US credit report system: in a large-scale study of credit reports by the Federal Trade Commission (FTC) in 2012, it was found that 26% of consumers had at least one mistake in their files, and 5% had errors that could be devastating. Yet, the process of getting such errors corrected is incredibly slow and opaque. When public radio reporter [Bobby Allyn](https://www.washingtonpost.com/posteverything/wp/2016/09/08/how-the-careless-errors-of-credit-reporting-agencies-are-ruining-peoples-lives/) discovered that he was erroneously listed as having a firearms conviction, it took him \"more than a dozen phone calls, the handiwork of a county court clerk and six weeks to solve the problem. And that was only after I contacted the company’s communications department as a journalist.\"\n", "\n", "As machine learning practitioners, we do not always think of it as our responsibility to understand how our algorithms end up being implemented in practice. But we need to." ] @@ -293,19 +293,19 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Feedback loops" + "### Feedback Loops" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "We have already explained in <> how an algorithm can interact with its enviromnent to create a feedback loop, making predictions that reinforce actions taken in the real world, which lead to predictions even more pronounced in the same direction. \n", - "As an example, we'll discuss YouTube's recommendation system. A couple of years ago Google talked about how they had introduced reinforcement learning (closely related to deep learning, but where your loss function represents a result which could be a long time after an action occurs) to improve their recommendation system. 
They described how they used an algorithm which made recommendations such that watch time would be optimised.\n",
+    "We explained in <> how an algorithm can interact with its environment to create a feedback loop, making predictions that reinforce actions taken in the real world, which lead to predictions even more pronounced in the same direction. \n",
+    "As an example, let's again consider YouTube's recommendation system. A couple of years ago the Google team talked about how they had introduced reinforcement learning (closely related to deep learning, but where your loss function represents a result potentially a long time after an action occurs) to improve YouTube's recommendation system. They described how they used an algorithm that made recommendations such that watch time would be optimized.\n",
     "\n",
-    "However, human beings tend to be drawn towards controversial content. This meant that videos about things like conspiracy theories started to get recommended more and more by the recommendation system. Furthermore, it turns out that the kinds of people that are interested in conspiracy theories are also people that watch a lot of online videos! So, they started to get drawn more and more towards YouTube. The increasing number of conspiracy theorists watching YouTube resulted in the algorithm recommending more and more conspiracy theories and other extremist content, which resulted in more extremists watching videos on YouTube, and more people watching YouTube developing extremist views, which led to the algorithm recommending more extremist content... The system became so out of control that in February 2019 it led the New York Times to run the headline \"YouTube Unleashed a Conspiracy Theory Boom. Can It Be Contained?\"footnote:[https://www.nytimes.com/2019/02/19/technology/youtube-conspiracy-stars.html]\n",
+    "However, human beings tend to be drawn to controversial content. 
This meant that videos about things like conspiracy theories started to get recommended more and more by the recommendation system. Furthermore, it turns out that the kinds of people that are interested in conspiracy theories are also people that watch a lot of online videos! So, they started to get drawn more and more toward YouTube. The increasing number of conspiracy theorists watching videos on YouTube resulted in the algorithm recommending more and more conspiracy theory and other extremist content, which resulted in more extremists watching videos on YouTube, and more people watching YouTube developing extremist views, which led to the algorithm recommending more extremist content... The system was spiraling out of control.\n",
     "\n",
-    "The New York Times published another article on YouTube's recommendation system, titled [On YouTube’s Digital Playground, an Open Gate for Pedophiles](https://www.nytimes.com/2019/06/03/world/americas/youtube-pedophiles.html). The article started with this chilling story:"
+    "And this phenomenon was not confined to this particular type of content. In June 2019 the *New York Times* published an article on YouTube's recommendation system, titled [\"On YouTube’s Digital Playground, an Open Gate for Pedophiles\"](https://www.nytimes.com/2019/06/03/world/americas/youtube-pedophiles.html). The article started with this chilling story:"
   ]
  },
  {
@@ -325,9 +325,9 @@
    "\n",
    "No one at Google planned to create a system that turned family videos into porn for pedophiles. So what happened?\n",
    "\n",
-    "Part of the problem here is the centrality of metrics in driving a financially important system. When an algorithm has a metric to optimise, as you have seen, it will do everything it can to optimise that number. 
This tends to lead to all kinds of edge cases, and humans interacting with a system will search for, find, and exploit these edge cases and feedback loops for their advantage.\n", + "Part of the problem here is the centrality of metrics in driving a financially important system. When an algorithm has a metric to optimize, as you have seen, it will do everything it can to optimize that number. This tends to lead to all kinds of edge cases, and humans interacting with a system will search for, find, and exploit these edge cases and feedback loops for their advantage.\n", "\n", - "There are signs that this is exactly what has happened with YouTube's recommendation system. The Guardian ran an article [How an ex-YouTube insider investigated its secret algorithm](https://www.theguardian.com/technology/2018/feb/02/youtube-algorithm-election-clinton-trump-guillaume-chaslot) about Guillaume Chaslot, an ex-YouTube engineer who created AlgoTransparency, which tracks these issues. Chaslot published the chart in <>, following the release of Robert Mueller's \"Report on the Investigation Into Russian Interference in the 2016 Presidential Election\"." + "There are signs that this is exactly what has happened with YouTube's recommendation system. *The Guardian* ran an article called [\"How an ex-YouTube Insider Investigated its Secret Algorithm\"](https://www.theguardian.com/technology/2018/feb/02/youtube-algorithm-election-clinton-trump-guillaume-chaslot) about Guillaume Chaslot, an ex-YouTube engineer who created AlgoTransparency, which tracks these issues. Chaslot published the chart in <>, following the release of Robert Mueller's \"Report on the Investigation Into Russian Interference in the 2016 Presidential Election.\"" ] }, { @@ -341,29 +341,29 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Russia Today's coverage of the Mueller report was an extreme outlier in how many channels were recommending it. 
This suggests the possibility that Russia Today, a state-owned Russia media outlet, has been successful in gaming YouTube's recommendation algorithm. The lack of transparency of systems like this makes it hard to uncover the kinds of problems that we're discussing.\n",
+    "Russia Today's coverage of the Mueller report was an extreme outlier in terms of how many channels were recommending it. This suggests the possibility that Russia Today, a state-owned Russian media outlet, has been successful in gaming YouTube's recommendation algorithm. Unfortunately, the lack of transparency of systems like this makes it hard to uncover the kinds of problems that we're discussing.\n",
     "\n",
-    "One of our reviewers for this book, Aurélien Géron, led YouTube's video classification team from 2013 to 2016 (well before the events discussed above). He pointed out that it's not just feedback loops involving humans that are a problem. There can also be feedback loops without humans! He told us about an example from YouTube:\n",
+    "One of our reviewers for this book, Aurélien Géron, led YouTube's video classification team from 2013 to 2016 (well before the events discussed here). He pointed out that it's not just feedback loops involving humans that are a problem. There can also be feedback loops without humans! He told us about an example from YouTube:\n",
     "\n",
-    "> : \"One important signal to classify the main topic of a video is the channel it comes from. For example, a video uploaded to a cooking channel is very likely to be a cooking video. But how do we know what topic a channel is about? Well… in part by looking at the topics of the videos it contains! Do you see the loop? For example, many videos have a description which indicates what camera was used to shoot the video. As a result, some of these videos might get classified as videos about “photography”. 
If a channel has such a misclassified video, it might be classified as a “photography” channel, making it even more likely for future videos on this channel to be wrongly classified as “photography”. This could even lead to runaway virus-like classifications! One way to break this feedback loop is to classify videos with and without the channel signal. Then when classifying the channels, you can only use the classes obtained without the channel signal. This way, the feedback loop is broken.\"\n", + "> : One important signal to classify the main topic of a video is the channel it comes from. For example, a video uploaded to a cooking channel is very likely to be a cooking video. But how do we know what topic a channel is about? Well… in part by looking at the topics of the videos it contains! Do you see the loop? For example, many videos have a description which indicates what camera was used to shoot the video. As a result, some of these videos might get classified as videos about “photography.” If a channel has such a misclassified video, it might be classified as a “photography” channel, making it even more likely for future videos on this channel to be wrongly classified as “photography.” This could even lead to runaway virus-like classifications! One way to break this feedback loop is to classify videos with and without the channel signal. Then when classifying the channels, you can only use the classes obtained without the channel signal. This way, the feedback loop is broken.\n", "\n", - "There are positive examples of people and organizations attempting to combat these problems. Evan Estola, lead machine learning engineer at Meetup, [discussed the example](https://www.youtube.com/watch?v=MqoRzNhrTnQ) of men expressing more interest than women in tech meetups. 
Meetup’s algorithm could recommend fewer tech meetups to women, and as a result, fewer women would find out about and attend tech meetups, which could cause the algorithm to suggest even fewer tech meetups to women, and so on in a self-reinforcing feedback loop. Evan and his team made the ethical decision for their recommendation algorithm to not create such a feedback loop, by explicitly not using gender for that part of their model. It is encouraging to see a company not just unthinkingly optimize a metric, but to consider its impact. \"You need to decide which feature not to use in your algorithm… the most optimal algorithm is perhaps not the best one to launch into production\", he said.\n", + "There are positive examples of people and organizations attempting to combat these problems. Evan Estola, lead machine learning engineer at Meetup, [discussed the example](https://www.youtube.com/watch?v=MqoRzNhrTnQ) of men expressing more interest than women in tech meetups. Taking gender into account could therefore cause Meetup’s algorithm to recommend fewer tech meetups to women, and as a result, fewer women would find out about and attend tech meetups, which could cause the algorithm to suggest even fewer tech meetups to women, and so on in a self-reinforcing feedback loop. So, Evan and his team made the ethical decision for their recommendation algorithm to not create such a feedback loop, by explicitly not using gender for that part of their model. It is encouraging to see a company not just unthinkingly optimize a metric, but consider its impact. According to Evan, \"You need to decide which feature not to use in your algorithm... the most optimal algorithm is perhaps not the best one to launch into production.\"\n", "\n", - "While Meetup chose to avoid such an outcome, Facebook provides an example of allowing a runaway feedback loop to run wild. Facebook radicalizes users interested in one conspiracy theory by introducing them to more. 
As [Renee DiResta, a researcher on proliferation of disinformation, writes](https://www.fastcompany.com/3059742/social-network-algorithms-are-distorting-reality-by-boosting-conspiracy-theories):" + "While Meetup chose to avoid such an outcome, Facebook provides an example of allowing a runaway feedback loop to run wild. Like YouTube, it tends to radicalize users interested in one conspiracy theory by introducing them to more. As Renee DiResta, a researcher on proliferation of disinformation, [writes](https://www.fastcompany.com/3059742/social-network-algorithms-are-distorting-reality-by-boosting-conspiracy-theories):" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "> : \"once people join a single conspiracy-minded \\[Facebook\\] group, they are algorithmically routed to a plethora of others. Join an anti-vaccine group, and your suggestions will include anti-GMO, chemtrail watch, flat Earther (yes, really), and ‘curing cancer naturally’ groups. Rather than pulling a user out of the rabbit hole, the recommendation engine pushes them further in.\"" + "> : Once people join a single conspiracy-minded [Facebook] group, they are algorithmically routed to a plethora of others. Join an anti-vaccine group, and your suggestions will include anti-GMO, chemtrail watch, flat Earther (yes, really), and \"curing cancer naturally\" groups. Rather than pulling a user out of the rabbit hole, the recommendation engine pushes them further in." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "It is extremely important to keep in mind this kind of behavior can happen, and to either anticipate a feedback loop or take positive action to break it when you can see the first signs of it in your own projects. Another thing to keep in mind is *bias*, which, as we discussed in the previous chapter, can interact with feedback loops in very troublesome ways." 
+ "It is extremely important to keep in mind that this kind of behavior can happen, and to either anticipate a feedback loop or take positive action to break it when you see the first signs of it in your own projects. Another thing to keep in mind is *bias*, which, as we discussed briefly in the previous chapter, can interact with feedback loops in very troublesome ways." ] }, { @@ -377,16 +377,16 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Discussions of bias online tend to get pretty confusing pretty fast. The word bias means so many different things. Statisticians often think that when data ethicists are talking about bias that they're talking about the statistical definition of the term bias. But they're not. And they're certainly not talking about the biases that appear in the weights and biases which are the parameters of your model!\n", + "Discussions of bias online tend to get pretty confusing pretty fast. The word \"bias\" means so many different things. Statisticians often think that when data ethicists are talking about bias, they're talking about the statistical definition of the term bias. But they're not. And they're certainly not talking about the biases that appear in the weights and biases which are the parameters of your model!\n", "\n", - "What they're talking about is the social science concept of bias. In [A Framework for Understanding Unintended Consequences of Machine Learning](https://arxiv.org/abs/1901.10002) MIT's Suresh and Guttag describe six types of bias in machine learning, summarized in <> from their paper." + "What they're talking about is the social science concept of bias. In [\"A Framework for Understanding Unintended Consequences of Machine Learning\"](https://arxiv.org/abs/1901.10002) MIT's Harini Suresh and John Guttag describe six types of bias in machine learning, summarized in <> from their paper." 
] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "\"A" + "\"A" ] }, { @@ -407,16 +407,16 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "*Historical bias* comes from the fact that people are biased, processes are biased, and society is biased. Suresh and Guttag say: \"Historical bias is a fundamental, structural issue with the first step of the data generation process and can exist even given perfect sampling and feature selection\".\n", + "*Historical bias* comes from the fact that people are biased, processes are biased, and society is biased. Suresh and Guttag say: \"Historical bias is a fundamental, structural issue with the first step of the data generation process and can exist even given perfect sampling and feature selection.\"\n", "\n", - "For instance, here's a few examples of historical *race bias* in the US, from the NY Times article [Racial Bias, Even When We Have Good Intentions](https://www.nytimes.com/2015/01/04/upshot/the-measuring-sticks-of-racial-bias-.html), by the University of Chicago's Sendhil Mullainathan:\n", + "For instance, here are a few examples of historical *race bias* in the US, from the *New York Times* article [\"Racial Bias, Even When We Have Good Intentions\"](https://www.nytimes.com/2015/01/04/upshot/the-measuring-sticks-of-racial-bias-.html) by the University of Chicago's Sendhil Mullainathan:\n", "\n", - " - When doctors were shown identical files, they were much less likely to recommend cardiac catheterization (a helpful procedure) to Black patients\n", - " - When bargaining for a used car, Black people were offered initial prices $700 higher and received far smaller concessions\n", - " - Responding to apartment-rental ads on Craigslist with a Black name elicited fewer responses than with a white name\n", - " - An all-white jury was 16 percentage points more likely to convict a Black defendant than a white one, but when a jury had 1 Black member, it convicted both at the same rate\n", + " - When 
doctors were shown identical files, they were much less likely to recommend cardiac catheterization (a helpful procedure) to Black patients.\n", + " - When bargaining for a used car, Black people were offered initial prices $700 higher and received far smaller concessions.\n", + " - Responding to apartment rental ads on Craigslist with a Black name elicited fewer responses than with a white name.\n", + " - An all-white jury was 16 percentage points more likely to convict a Black defendant than a white one, but when a jury had one Black member it convicted both at the same rate.\n", "\n", - "The COMPAS algorithm, widely used for sentencing and bail decisions in the US, is an example of an important algorithm which, when tested by ProPublica, showed clear racial bias in practice:" + "The COMPAS algorithm, widely used for sentencing and bail decisions in the US, is an example of an important algorithm that, when tested by [ProPublica](https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing), showed clear racial bias in practice (<>)." ] }, { @@ -430,7 +430,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Any dataset involving humans can have this kind of bias, such as medical data, sales data, housing data, political data, and so on. Because underlying bias is so pervasive, bias in datasets is very pervasive. Racial bias even turns up in computer vision, as shown in this example of auto-categorized photos shared on Twitter by a Google Photos user:" + "Any dataset involving humans can have this kind of bias: medical data, sales data, housing data, political data, and so on. Because underlying bias is so pervasive, bias in datasets is very pervasive. Racial bias even turns up in computer vision, as shown in <>, an example of autocategorized photos shared on Twitter by a Google Photos user." 
] }, { @@ -446,21 +446,21 @@ "source": [ "Yes, that is showing what you think it is: Google Photos classified a Black user's photo with their friend as \"gorillas\"! This algorithmic misstep got a lot of attention in the media. “We’re appalled and genuinely sorry that this happened,” a company spokeswoman said. “There is still clearly a lot of work to do with automatic image labeling, and we’re looking at how we can prevent these types of mistakes from happening in the future.”\n", "\n", - "Unfortunately, fixing problems in machine learning systems when the input data has problems is hard. Google's first attempt didn't inspire confidence, as covered by The Guardian:" + "Unfortunately, fixing problems in machine learning systems when the input data has problems is hard. Google's first attempt didn't inspire confidence, as coverage by *The Guardian* suggested (<>)." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "\"Pictures" + "\"Pictures" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "These kinds of problem are certainly not limited to just Google. MIT researchers studied the most popular online computer vision APIs to see how accurate they were. But they didn't just calculate a single accuracy number—instead, they looked at the accuracy across four different groups:" + "These kinds of problems are certainly not limited to just Google. MIT researchers studied the most popular online computer vision APIs to see how accurate they were. But they didn't just calculate a single accuracy number—instead, they looked at the accuracy across four different groups, as illustrated in <>." ] }, { @@ -474,11 +474,11 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "IBM's system, for instance, had a 34.7% error rate for darker females, vs 0.3% for lighter males—over 100 times more errors! Some people incorrectly reacted to these experiments by claiming that the difference was simply because darker skin is harder for computers to recognise. 
However, what actually happened is that, after the negative publicity that this result created, all of the companies in question dramatically improved their models for darker skin, such that one year later they were nearly as good as for lighter skin. So what this actually showed is that the developers failed to utilise datasets containing enough darker faces, or test their product with darker faces.\n", + "IBM's system, for instance, had a 34.7% error rate for darker females, versus 0.3% for lighter males—over 100 times more errors! Some people incorrectly reacted to these experiments by claiming that the difference was simply because darker skin is harder for computers to recognize. However, what actually happened was that, after the negative publicity that this result created, all of the companies in question dramatically improved their models for darker skin, such that one year later they were nearly as good as for lighter skin. So what this actually showed is that the developers failed to utilize datasets containing enough darker faces, or test their product with darker faces.\n", "\n", - "One of the MIT researchers, Joy Buolamwini, warned, \"We have entered the age of automation overconfident yet underprepared. If we fail to make ethical and inclusive artificial intelligence, we risk losing gains made in civil rights and gender equity under the guise of machine neutrality\".\n", + "One of the MIT researchers, Joy Buolamwini, warned: \"We have entered the age of automation overconfident yet underprepared. If we fail to make ethical and inclusive artificial intelligence, we risk losing gains made in civil rights and gender equity under the guise of machine neutrality.\"\n", "\n", - "Part of the issue appears to be a systematic imbalance in the make up of popular datasets used for training models. 
The abstract to the paper [No Classification without Representation: Assessing Geodiversity Issues in Open Data Sets for the Developing World](https://arxiv.org/abs/1711.08536) states, \"We analyze two large, publicly available image data sets to assess geo-diversity and find that these data sets appear to exhibit an observable amerocentric and eurocentric representation bias. Further, we analyze classifiers trained on these data sets to assess the impact of these training distributions and find strong differences in the relative performance on images from different locales\". <> shows one of the charts from the paper, showing the geographic make up of what was, at the time (and still, as this book is being written), the two most important image datasets for training models." + "Part of the issue appears to be a systematic imbalance in the makeup of popular datasets used for training models. The abstract to the paper [\"No Classification Without Representation: Assessing Geodiversity Issues in Open Data Sets for the Developing World\"](https://arxiv.org/abs/1711.08536) by Shreya Shankar et al. states, \"We analyze two large, publicly available image data sets to assess geo-diversity and find that these data sets appear to exhibit an observable amerocentric and eurocentric representation bias. Further, we analyze classifiers trained on these data sets to assess the impact of these training distributions and find strong differences in the relative performance on images from different locales.\" <> shows one of the charts from the paper, showing the geographic makeup of what were at the time (and still are, as this book is being written) the two most important image datasets for training models." ] }, { @@ -492,7 +492,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The vast majority of the images are from the United States and other Western countries, leading to models trained on ImageNet performing worse on scenes from other countries and cultures. 
For instance, [research](https://arxiv.org/pdf/1906.02659.pdf) found that such models are worse at identifying household items (such as soap, spices, sofas, or beds) from lower-income countries. <> shows an image from the paper, [Does Object Recognition Work for Everyone?](https://arxiv.org/pdf/1906.02659.pdf)." + "The vast majority of the images are from the United States and other Western countries, leading to models trained on ImageNet performing worse on scenes from other countries and cultures. For instance, research found that such models are worse at identifying household items (such as soap, spices, sofas, or beds) from lower-income countries. <> shows an image from the paper, [\"Does Object Recognition Work for Everyone?\"](https://arxiv.org/pdf/1906.02659.pdf) by Terrance DeVries et al. of Facebook AI Research that illustrates this point." ] }, { @@ -510,7 +510,7 @@ "\n", "As we will discuss shortly, in addition, the vast majority of AI researchers and developers are young white men. Most projects that we have seen do most user testing using friends and families of the immediate product development group. Given this, the kinds of problems we just discussed should not be surprising.\n", "\n", - "Similar historical bias is found in the texts used as data for natural language processing models. This crops up in downstream machine learning tasks in many ways. For instance, it [was widely reported](https://nypost.com/2017/11/30/google-translates-algorithm-has-a-gender-bias/) that until last year Google Translate showed systematic bias in how it translated the Turkish gender-neutral pronoun \"o\" into English. For instance, when applied to jobs which are often associated with males, it used \"he\", and when applied to jobs which are often associated with females, it used \"she\":" + "Similar historical bias is found in the texts used as data for natural language processing models. This crops up in downstream machine learning tasks in many ways. 
For instance, it [was widely reported](https://nypost.com/2017/11/30/google-translates-algorithm-has-a-gender-bias/) that until last year Google Translate showed systematic bias in how it translated the Turkish gender-neutral pronoun \"o\" into English: when applied to jobs which are often associated with males it used \"he,\" and when applied to jobs which are often associated with females it used \"she\" (<>)." ] }, { @@ -524,7 +524,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We also see this kind of bias in online advertisements. For instance, a study in 2019 found that even when the person placing the ad does not intentionally discriminate, Facebook will show the ad to very different audiences based on race and gender. Housing ads with the same text, but changing the picture between a white or black family, were shown to racially different audiences." + "We also see this kind of bias in online advertisements. For instance, a [study](https://arxiv.org/abs/1904.02095) in 2019 by Muhammad Ali et al. found that even when the person placing the ad does not intentionally discriminate, Facebook will show ads to very different audiences based on race and gender. Housing ads with the same text, but picturing either a white or a Black family, were shown to racially different audiences." ] }, { @@ -538,48 +538,48 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "In the paper [Does Machine Learning Automate Moral Hazard and Error](https://scholar.harvard.edu/files/sendhil/files/aer.p20171084.pdf) in *American Economic Review*, the authors look at a model that tries to answer the question: using historical electronic health record (EHR) data, what factors are most predictive of stroke? 
These are the top predictors from the model:\n", + "In the paper [\"Does Machine Learning Automate Moral Hazard and Error\"](https://scholar.harvard.edu/files/sendhil/files/aer.p20171084.pdf) in *American Economic Review*, Sendhil Mullainathan and Ziad Obermeyer look at a model that tries to answer the question: using historical electronic health record (EHR) data, what factors are most predictive of stroke? These are the top predictors from the model:\n", "\n", - " - Prior Stroke\n", + " - Prior stroke\n", " - Cardiovascular disease\n", " - Accidental injury\n", " - Benign breast lump\n", " - Colonoscopy\n", " - Sinusitis\n", "\n", - "However, only the top two have anything to do with a stroke! Based on what we've studied so far, you can probably guess why. We haven’t really measured *stroke*, which occurs when a region of the brain is denied oxygen due to an interruption in the blood supply. What we’ve measured is who: had symptoms, went to a doctor, got the appropriate tests, AND received a diagnosis of stroke. Actually having a stroke is not the only thing correlated with this complete list — it's also correlated with being the kind of person who actually goes to the doctor (which is influenced by who has access to healthcare, can afford their co-pay, doesn't experience racial or gender-based medical discrimination, and more)! If you are likely to go to the doctor for an *accidental injury*, then you are likely to also go the doctor when you are having a stroke.\n", + "However, only the top two have anything to do with a stroke! Based on what we've studied so far, you can probably guess why. We haven’t really measured *stroke*, which occurs when a region of the brain is denied oxygen due to an interruption in the blood supply. What we’ve measured is who had symptoms, went to a doctor, got the appropriate tests, *and* received a diagnosis of stroke. 
Actually having a stroke is not the only thing correlated with this complete list—it's also correlated with being the kind of person who actually goes to the doctor (which is influenced by who has access to healthcare, can afford their co-pay, doesn't experience racial or gender-based medical discrimination, and more)! If you are likely to go to the doctor for an *accidental injury*, then you are likely to also go to the doctor when you are having a stroke.\n", "\n", - "This is an example of *measurement bias*. It occurs when our models make mistakes because we are measuring the wrong thing, or measuring it in the wrong way, or incorporating that measurement into our model inappropriately." + "This is an example of *measurement bias*. It occurs when our models make mistakes because we are measuring the wrong thing, or measuring it in the wrong way, or incorporating that measurement into the model inappropriately." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "#### Aggregation Bias" + "#### Aggregation bias" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "*Aggregation bias* occurs when models do not aggregate data in a way that incorporates all of the appropriate factors, or when a model does not include the necessary interaction terms, nonlinearities, or so forth. This can particularly occur in medical settings. For instance, the way diabetes is treated is often based on simple univariate statistics and studies involving small groups of heterogeneous people. Analysis of results is often done in a way that does not take account of different ethnicities or genders. However, it turns out that diabetes patients have [different complications across ethnicities](https://www.ncbi.nlm.nih.gov/pubmed/24037313), and HbA1c levels (widely used to diagnose and monitor diabetes) [differ in complex ways across ethnicities and genders](https://www.ncbi.nlm.nih.gov/pubmed/22238408). 
This can result in people being misdiagnosed or incorrectly treated because medical decisions are based on a model which does not include these important variables and interactions." + "*Aggregation bias* occurs when models do not aggregate data in a way that incorporates all of the appropriate factors, or when a model does not include the necessary interaction terms, nonlinearities, or so forth. This can particularly occur in medical settings. For instance, the way diabetes is treated is often based on simple univariate statistics and studies involving small groups of heterogeneous people. Analysis of results is often done in a way that does not take account of different ethnicities or genders. However, it turns out that diabetes patients have [different complications across ethnicities](https://www.ncbi.nlm.nih.gov/pubmed/24037313), and HbA1c levels (widely used to diagnose and monitor diabetes) [differ in complex ways across ethnicities and genders](https://www.ncbi.nlm.nih.gov/pubmed/22238408). This can result in people being misdiagnosed or incorrectly treated because medical decisions are based on a model that does not include these important variables and interactions." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "#### Representation Bias" + "#### Representation bias" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "The abstract of the paper [Bias in Bios: A Case Study of Semantic Representation Bias in a High-Stakes Setting](https://arxiv.org/abs/1901.09451) notes that there is gender imbalance in occupations (e.g. 
females are more likely to be nurses, and males are more likely to be pastors), and says that: \"differences in true positive rates between genders are correlated with existing gender imbalances in occupations, which may compound these imbalances\".\n", + "The abstract of the paper [\"Bias in Bios: A Case Study of Semantic Representation Bias in a High-Stakes Setting\"](https://arxiv.org/abs/1901.09451) by Maria De-Arteaga et al. notes that there is gender imbalance in occupations (e.g., females are more likely to be nurses, and males are more likely to be pastors), and says that: \"differences in true positive rates between genders are correlated with existing gender imbalances in occupations, which may compound these imbalances.\"\n", "\n", - "What this is saying is that the researchers noticed that models predicting occupation did not only reflect the actual gender imbalance in the underlying population, but actually amplified it! This is quite common, particularly for simple models. When there is some clear, easy-to-see underlying relationship, a simple model will often simply assume that this relationship holds all the time. As <> from the paper shows, for occupations which had a higher percentage of females, the model tended to overestimate the prevalence of that occupation." + "In other words, the researchers noticed that models predicting occupation did not only *reflect* the actual gender imbalance in the underlying population, but actually *amplified* it! This type of *representation bias* is quite common, particularly for simple models. When there is some clear, easy-to-see underlying relationship, a simple model will often simply assume that this relationship holds all the time. As <> from the paper shows, for occupations that had a higher percentage of females, the model tended to overestimate the prevalence of that occupation." 
] }, { @@ -593,7 +593,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "For example, in the training dataset, 14.6% of surgeons were women, yet in the model predictions, only 11.6% of the true positives were women. The model is thus amplifying the bias existing in the training set.\n", + "For example, in the training dataset 14.6% of surgeons were women, yet in the model predictions only 11.6% of the true positives were women. The model is thus amplifying the bias existing in the training set.\n", "\n", "Now that we've seen that those biases exist, what can we do to mitigate them?" ] @@ -602,40 +602,33 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Addressing different types of bias" + "### Addressing different types of bias" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Different types of bias require different approaches for mitigation. While gathering a more diverse dataset can address representation bias, this would not help with historical bias or measurement bias. All datasets contain bias. There is no such thing as a completely de-biased dataset. Many researchers in the field have been converging on a set of proposals towards better documenting the decisions, context, and specifics about how and why a particular dataset was created, what scenarios it is appropriate to use in, and what the limitations are. This way, those using the dataset will not be caught off-guard by its biases and limitations." + "Different types of bias require different approaches for mitigation. While gathering a more diverse dataset can address representation bias, this would not help with historical bias or measurement bias. All datasets contain bias. There is no such thing as a completely debiased dataset. 
Many researchers in the field have been converging on a set of proposals to enable better documentation of the decisions, context, and specifics about how and why a particular dataset was created, what scenarios it is appropriate to use in, and what the limitations are. This way, those using a particular dataset will not be caught off guard by its biases and limitations." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Humans are biased, so does algorithmic bias matter?" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "We often hear this question — \"humans are biased, so does algorithmic bias even matter?\" This comes up so often, there must be some reasoning that makes sense to the people that ask it, but it doesn't seem very logically sound to us! Independently of whether this is logically sound, it's important to realise that algorithms and people are different. Machine learning, particularly so. Consider these points about machine learning algorithms:\n", + "We often hear the question—\"Humans are biased, so does algorithmic bias even matter?\" This comes up so often, there must be some reasoning that makes sense to the people that ask it, but it doesn't seem very logically sound to us! Independently of whether this is logically sound, it's important to realize that algorithms (particularly machine learning algorithms!) and people are different. Consider these points about machine learning algorithms:\n", "\n", - " - _Machine learning can create feedback loops_:: small amounts of bias can very rapidly, exponentially increase due to feedback loops\n", - " - _Machine learning can amplify bias_:: human bias can lead to larger amounts of machine learning bias\n", - " - _Algorithms & humans are used differently_:: human decision makers and algorithmic decision makers are not used in a plug-and-play interchangeable way in practice. 
For instance, algorithmic decisions are more likely to be implemented at scale and without a process for recourse. Furthermore, people are more likely to mistakenly believe that the result of an algorithm is objective and error-free.\n", + " - _Machine learning can create feedback loops_:: Small amounts of bias can rapidly increase exponentially due to feedback loops.\n", + " - _Machine learning can amplify bias_:: Human bias can lead to larger amounts of machine learning bias.\n", + " - _Algorithms & humans are used differently_:: Human decision makers and algorithmic decision makers are not used in a plug-and-play interchangeable way in practice.\n", " - _Technology is power_:: And with that comes responsibility.\n", "\n", - "As the Arkansas healthcare example showed, machine learning is often implemented in practice not because it leads to better outcomes, but because it is cheaper and more efficient. Cathy O'Neill, in her book *Weapons of Math Destruction*, described the pattern of how the privileged are processed by people, whereas the poor are processed by algorithms. This is just one of a number of ways that algorithms are used differently than human decision makers. Others include:\n", + "As the Arkansas healthcare example showed, machine learning is often implemented in practice not because it leads to better outcomes, but because it is cheaper and more efficient. Cathy O'Neill, in her book *Weapons of Math Destruction* (Crown), described the pattern of how the privileged are processed by people, whereas the poor are processed by algorithms. This is just one of a number of ways that algorithms are used differently than human decision makers. 
Others include:\n", "\n", - " - People are more likely to assume algorithms are objective or error-free (even if they’re given the option of a human override)\n", - " - Algorithms are more likely to be implemented with no appeals process in place\n", - " - Algorithms are often used at scale\n", - " - Algorithmic systems are cheap\n", + " - People are more likely to assume algorithms are objective or error-free (even if they’re given the option of a human override).\n", + " - Algorithms are more likely to be implemented with no appeals process in place.\n", + " - Algorithms are often used at scale.\n", + " - Algorithmic systems are cheap.\n", "\n", "Even in the absence of bias, algorithms (and deep learning especially, since it is such an effective and scalable algorithm) can lead to negative societal problems, such as when used for *disinformation*." ] @@ -644,47 +637,47 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Disinformation" + "### Disinformation" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "*Disinformation* has a history stretching back hundreds or even thousands of years. It is not necessarily about getting someone to believe something false, but rather, often to sow disharmony and uncertainty, and to get people to give up on seeking the truth. Receiving conflicting accounts can lead people to assume that they can never know what to trust.\n", + "*Disinformation* has a history stretching back hundreds or even thousands of years. It is not necessarily about getting someone to believe something false, but rather often used to sow disharmony and uncertainty, and to get people to give up on seeking the truth. Receiving conflicting accounts can lead people to assume that they can never know whom or what to trust.\n", "\n", - "Some people think disinformation is primarily about false information or *fake news*, but in reality, disinformation can often contain seeds of truth, or involve half-truths taken out of context. 
Ladislav Bittman was an intelligence officer in the USSR who later defected to the United States and wrote some books in the 1970s and 1980s on the role of disinformation in Soviet propaganda operations. He said, \"Most campaigns are a carefully designed mixture of facts, half-truths, exaggerations, & deliberate lies.\"\n", + "Some people think disinformation is primarily about false information or *fake news*, but in reality, disinformation can often contain seeds of truth, or half-truths taken out of context. Ladislav Bittman was an intelligence officer in the USSR who later defected to the US and wrote some books in the 1970s and 1980s on the role of disinformation in Soviet propaganda operations. In *The KGB and Soviet Disinformation* (Pergamon) he wrote, \"Most campaigns are a carefully designed mixture of facts, half-truths, exaggerations, and deliberate lies.\"\n", "\n", - "In the United States this has hit close to home in recent years, with the FBI detailing a massive disinformation campaign linked to Russia in the 2016 US election. Understanding the disinformation that was used in this campaign is very educational. For instance, the FBI found that the Russian disinformation campaign often organized two separate fake *grass roots* protests, one for each side of an issue, and got them to protest at the same time! The Houston Chronicle reported on one of these odd events:\n", + "In the US this has hit close to home in recent years, with the FBI detailing a massive disinformation campaign linked to Russia in the 2016 election. Understanding the disinformation that was used in this campaign is very educational. For instance, the FBI found that the Russian disinformation campaign often organized two separate fake \"grass roots\" protests, one for each side of an issue, and got them to protest at the same time! 
The [*Houston Chronicle*](https://www.houstonchronicle.com/local/gray-matters/article/A-Houston-protest-organized-by-Russian-trolls-12625481.php) reported on one of these odd events (<>).\n", "\n", - "> : A group that called itself the \"Heart of Texas\" had organized it on social media — a protest, they said, against the \"Islamization\" of Texas. On one side of Travis Street, I found about 10 protesters. On the other side, I found around 50 counterprotesters. But I couldn't find the rally organizers. No \"Heart of Texas.\" I thought that was odd, and mentioned it in the article: What kind of group is a no-show at its own event? Now I know why. Apparently, the rally's organizers were in Saint Petersburg, Russia, at the time. \"Heart of Texas\" is one of the internet troll groups cited in Special Prosecutor Robert Mueller's recent indictment of Russians attempting to tamper with the U.S. presidential election." + "> : A group that called itself the \"Heart of Texas\" had organized it on social media—a protest, they said, against the \"Islamization\" of Texas. On one side of Travis Street, I found about 10 protesters. On the other side, I found around 50 counterprotesters. But I couldn't find the rally organizers. No \"Heart of Texas.\" I thought that was odd, and mentioned it in the article: What kind of group is a no-show at its own event? Now I know why. Apparently, the rally's organizers were in Saint Petersburg, Russia, at the time. \"Heart of Texas\" is one of the internet troll groups cited in Special Prosecutor Robert Mueller's recent indictment of Russians attempting to tamper with the U.S. presidential election." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "\"Screenshot" + "\"Screenshot" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Disinformation often involves coordinated campaigns of inauthentic behavior. For instance, fraudulent accounts may try to make it seem like many people hold a particular viewpoint. 
While most of us like to think of ourselves as independent-minded, in reality we evolved to be influenced by others in our in-group, and in opposition to those in our out-group. Online discussions can influence our viewpoints, or alter the range of what we consider acceptable viewpoints. Humans are social animals, and as social animals we are extremely influenced by the people around us. Increasingly, radicalisation occurs in online environments. So influence is coming from people in the virtual space of online forums and social networks.\n", + "Disinformation often involves coordinated campaigns of inauthentic behavior. For instance, fraudulent accounts may try to make it seem like many people hold a particular viewpoint. While most of us like to think of ourselves as independent-minded, in reality we evolved to be influenced by others in our in-group, and in opposition to those in our out-group. Online discussions can influence our viewpoints, or alter the range of what we consider acceptable viewpoints. Humans are social animals, and as social animals we are extremely influenced by the people around us. Increasingly, radicalization occurs in online environments; influence is coming from people in the virtual space of online forums and social networks.\n", "\n", - "Disinformation through auto-generated text is a particularly significant issue, due to the greatly increased capability provided by deep learning. We discuss this issue in depth when we learn to create language models, in <>.\n", + "Disinformation through autogenerated text is a particularly significant issue, due to the greatly increased capability provided by deep learning. We discuss this issue in depth when we delve into creating language models, in <>.\n", "\n", - "One proposed approach is to develop some form of digital signature, to implement it in a seamless way, and to create norms that we should only trust content which has been verified. 
The head of the Allen Institute on AI, Oren Etzioni, wrote such a proposal in an article titled [How Will We Prevent AI-Based Forgery?](https://hbr.org/2019/03/how-will-we-prevent-ai-based-forgery): \"AI is poised to make high-fidelity forgery inexpensive and automated, leading to potentially disastrous consequences for democracy, security, and society. The specter of AI forgery means that we need to act to make digital signatures de rigueur as a means of authentication of digital content.\"\n", + "One proposed approach is to develop some form of digital signature, to implement it in a seamless way, and to create norms that we should only trust content that has been verified. The head of the Allen Institute on AI, Oren Etzioni, wrote such a proposal in an article titled [\"How Will We Prevent AI-Based Forgery?\"](https://hbr.org/2019/03/how-will-we-prevent-ai-based-forgery): \"AI is poised to make high-fidelity forgery inexpensive and automated, leading to potentially disastrous consequences for democracy, security, and society. The specter of AI forgery means that we need to act to make digital signatures de rigueur as a means of authentication of digital content.\"\n", "\n", - "Whilst we can't hope to discuss all the ethical issues that deep learning, and algorithms more generally, bring up, hopefully this brief introduction has been a useful starting point you can build on. We'll now move on to the questions of how to identify ethical issues, and what to do about them." + "Whilst we can't hope to discuss all the ethical issues that deep learning, and algorithms more generally, brings up, hopefully this brief introduction has been a useful starting point you can build on. We'll now move on to the questions of how to identify ethical issues, and what to do about them." 
] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Identifying and addressing ethical issues" + "## Identifying and Addressing Ethical Issues" ] }, { @@ -695,19 +688,19 @@ "\n", "So what can we do? This is a big topic, but a few steps towards addressing ethical issues are:\n", "\n", - "- analyze a project you are working on\n", - "- implement processes at your company to find and address ethical risks\n", - "- support good policy\n", - "- increase diversity\n", + "- Analyze a project you are working on.\n", + "- Implement processes at your company to find and address ethical risks.\n", + "- Support good policy.\n", + "- Increase diversity.\n", "\n", - "Let's walk through each step next, starting with analyzing a project you are working on." + "Let's walk through each of these steps, starting with analyzing a project you are working on." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Analyze a project you are working on" + "### Analyze a Project You Are Working On" ] }, { @@ -726,25 +719,25 @@ "\n", "These questions may be able to help you identify outstanding issues, and possible alternatives that are easier to understand and control. In addition to asking the right questions, it's also important to consider practices and processes to implement.\n", "\n", - "One thing to consider at this stage is what data you are collecting and storing. Data often ends up being used for different purposes than what it was originally collected for. For instance, IBM began selling to Nazi Germany well before the Holocaust, including helping with Germany’s 1933 census conducted by Adolf Hitler, which was effective at identifying far more Jewish people than had previously been recognized in Germany. US census data was used to round up Japanese-Americans (who were US citizens) for internment during World War II. It is important to recognize how data and images collected can be weaponized later. 
Columbia professor [Tim Wu wrote](https://www.nytimes.com/2019/04/10/opinion/sunday/privacy-capitalism.html) that “You must assume that any personal data that Facebook or Android keeps are data that governments around the world will try to get or that thieves will try to steal.”" + "One thing to consider at this stage is what data you are collecting and storing. Data often ends up being used for different purposes than what it was originally collected for. For instance, IBM began selling to Nazi Germany well before the Holocaust, including helping with Germany’s 1933 census conducted by Adolf Hitler, which was effective at identifying far more Jewish people than had previously been recognized in Germany. Similarly, US census data was used to round up Japanese-Americans (who were US citizens) for internment during World War II. It is important to recognize how data and images collected can be weaponized later. Columbia professor [Tim Wu wrote](https://www.nytimes.com/2019/04/10/opinion/sunday/privacy-capitalism.html) that “You must assume that any personal data that Facebook or Android keeps are data that governments around the world will try to get or that thieves will try to steal.”" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Processes to implement" + "### Processes to Implement" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "The Markkula Center has released [An Ethical Toolkit for Engineering/Design Practice](https://www.scu.edu/ethics-in-technology-practice/ethical-toolkit/), which includes some concrete practices to implement at your company, including regularly scheduled ethical risk sweeps to proactively search for ethical risks (in a manner similar to cybersecurity penetration testing), expanding the ethical circle to include the perspectives of a variety of stakeholders, and considering the terrible people (how could bad actors abuse, steal, misinterpret, hack, destroy, or weaponize what you are building?). 
\n", + "The Markkula Center has released [An Ethical Toolkit for Engineering/Design Practice](https://www.scu.edu/ethics-in-technology-practice/ethical-toolkit/) that includes some concrete practices to implement at your company, including regularly scheduled sweeps to proactively search for ethical risks (in a manner similar to cybersecurity penetration testing), expanding the ethical circle to include the perspectives of a variety of stakeholders, and considering the terrible people (how could bad actors abuse, steal, misinterpret, hack, destroy, or weaponize what you are building?). \n", "\n", "Even if you don't have a diverse team, you can still try to pro-actively include the perspectives of a wider group, considering questions such as these (provided by the Markkula Center):\n", "\n", - " - Whose interests, desires, skills, experiences and values have we simply assumed, rather than actually consulted?\n", + " - Whose interests, desires, skills, experiences, and values have we simply assumed, rather than actually consulted?\n", " - Who are all the stakeholders who will be directly affected by our product? How have their interests been protected? How do we know what their interests really are—have we asked?\n", " - Who/which groups and individuals will be indirectly affected in significant ways?\n", " - Who might use this product that we didn’t expect to use it, or for purposes we didn’t initially intend?" @@ -754,35 +747,35 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "#### Ethical Lenses" + "#### Ethical lenses" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Another useful resource from the Markkula Center is [Conceptual Frameworks in Technology and Engineering Practice](https://www.scu.edu/ethics-in-technology-practice/conceptual-frameworks/). 
This considers how different foundational ethical lenses can help identify concrete issues, and lays out the following approaches and key questions:\n", + "Another useful resource from the Markkula Center is its [Conceptual Frameworks in Technology and Engineering Practice](https://www.scu.edu/ethics-in-technology-practice/conceptual-frameworks/). This considers how different foundational ethical lenses can help identify concrete issues, and lays out the following approaches and key questions:\n", "\n", - " - The Rights Approach:: Which option best respects the rights of all who have a stake?\n", - " - The Justice Approach:: Which option treats people equally or proportionately?\n", - " - The Utilitarian Approach:: Which option will produce the most good and do the least harm?\n", - " - The Common Good Approach:: Which option best serves the community as a whole, not just some members?\n", - " - The Virtue Approach:: Which option leads me to act as the sort of person I want to be?\n", + " - The rights approach:: Which option best respects the rights of all who have a stake?\n", + " - The justice approach:: Which option treats people equally or proportionately?\n", + " - The utilitarian approach:: Which option will produce the most good and do the least harm?\n", + " - The common good approach:: Which option best serves the community as a whole, not just some members?\n", + " - The virtue approach:: Which option leads me to act as the sort of person I want to be?\n", "\n", - "Markkula's recommendations include a deeper dive into each of these perspectives, including looking at a project based on a focus on its *consequences*:\n", + "Markkula's recommendations include a deeper dive into each of these perspectives, including looking at a project through the lenses of its *consequences*:\n", "\n", " - Who will be directly affected by this project? 
Who will be indirectly affected?\n", " - Will the effects in aggregate likely create more good than harm, and what types of good and harm?\n", " - Are we thinking about all relevant types of harm/benefit (psychological, political, environmental, moral, cognitive, emotional, institutional, cultural)?\n", " - How might future generations be affected by this project?\n", " - Do the risks of harm from this project fall disproportionately on the least powerful in society? Will the benefits go disproportionately to the well-off?\n", - " - Have we adequately considered ‘dual-use?\n", + " - Have we adequately considered \"dual-use\"?\n", "\n", - "The alternative lens to this is the *deontological* perspective, which focuses on basic *right* and *wrong*:\n", + "The alternative lens to this is the *deontological* perspective, which focuses on basic concepts of *right* and *wrong*:\n", "\n", - " - What rights of others & duties to others must we respect?\n", - " - How might the dignity & autonomy of each stakeholder be impacted by this project?\n", - " - What considerations of trust & of justice are relevant to this design/project?\n", + " - What rights of others and duties to others must we respect?\n", + " - How might the dignity and autonomy of each stakeholder be impacted by this project?\n", + " - What considerations of trust and of justice are relevant to this design/project?\n", " - Does this project involve any conflicting moral duties to others, or conflicting stakeholder rights? How can we prioritize these?\n", "\n", "One of the best ways to help come up with complete and thoughtful answers to questions like these is to ensure that the people asking the questions are *diverse*." @@ -792,36 +785,36 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### The power of diversity" + "### The Power of Diversity" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Currently, less than 12% of AI researchers are women, according to a study from Element AI. 
The statistics are similarly dire when it comes to race and age. When everybody on a team has similar backgrounds, they are likely to have similar blindspots around ethical risks. The Harvard Business Review (HBR) has published a number of studies showing many benefits of diverse teams, including:\n", + "Currently, less than 12% of AI researchers are women, according to [a study from Element AI](https://medium.com/element-ai-research-lab/estimating-the-gender-ratio-of-ai-researchers-around-the-world-81d2b8dbe9c3). The statistics are similarly dire when it comes to race and age. When everybody on a team has similar backgrounds, they are likely to have similar blindspots around ethical risks. The *Harvard Business Review* (HBR) has published a number of studies showing many benefits of diverse teams, including:\n", "\n", - "- [How Diversity Can Drive Innovation](https://hbr.org/2013/12/how-diversity-can-drive-innovation)\n", - "- [Teams Solve Problems Faster When They’re More Cognitively Diverse](https://hbr.org/2017/03/teams-solve-problems-faster-when-theyre-more-cognitively-diverse)\n", - "- [Why Diverse Teams Are Smarter](https://hbr.org/2016/11/why-diverse-teams-are-smarter), and\n", - "- [What Makes a Team Smarter? More Women](https://hbr.org/2011/06/defend-your-research-what-makes-a-team-smarter-more-women).\n", + "- [\"How Diversity Can Drive Innovation\"](https://hbr.org/2013/12/how-diversity-can-drive-innovation)\n", + "- [\"Teams Solve Problems Faster When They’re More Cognitively Diverse\"](https://hbr.org/2017/03/teams-solve-problems-faster-when-theyre-more-cognitively-diverse)\n", + "- [\"Why Diverse Teams Are Smarter\"](https://hbr.org/2016/11/why-diverse-teams-are-smarter), and\n", + "- [\"Defend Your Research: What Makes a Team Smarter? More Women\"](https://hbr.org/2011/06/defend-your-research-what-makes-a-team-smarter-more-women)\n", "\n", - "Diversity can lead to problems being identified earlier, and a wider range of solutions being considered. 
For instance, Tracy Chou was an early engineer at Quora. She [wrote of her experiences](https://qz.com/1016900/tracy-chou-leading-silicon-valley-engineer-explains-why-every-tech-worker-needs-a-humanities-education/), describing how she advocated internally for adding a feature that would allow trolls and other bad actors to be blocked. Chou recounts, “I was eager to work on the feature because I personally felt antagonized and abused on the site (gender isn’t an unlikely reason as to why)... But if I hadn’t had that personal perspective, it’s possible that the Quora team wouldn’t have prioritized building a block button so early in its existence.” Harassment often drives people from marginalised groups off online platforms, so this functionality has been important for maintaining the health of Quora's community.\n", + "Diversity can lead to problems being identified earlier, and a wider range of solutions being considered. For instance, Tracy Chou was an early engineer at Quora. She [wrote of her experiences](https://qz.com/1016900/tracy-chou-leading-silicon-valley-engineer-explains-why-every-tech-worker-needs-a-humanities-education/), describing how she advocated internally for adding a feature that would allow trolls and other bad actors to be blocked. Chou recounts, “I was eager to work on the feature because I personally felt antagonized and abused on the site (gender isn’t an unlikely reason as to why)... But if I hadn’t had that personal perspective, it’s possible that the Quora team wouldn’t have prioritized building a block button so early in its existence.” Harassment often drives people from marginalized groups off online platforms, so this functionality has been important for maintaining the health of Quora's community.\n", "\n", - "A crucial aspect to understand is that women leave the tech industry at over twice the rate that men do, according to the Harvard business review (41% of women working in tech leave, compared to 17% of men). 
An analysis of over 200 books, white papers, and articles found that the reason they leave is that “they’re treated unfairly; underpaid, less likely to be fast-tracked than their male colleagues, and unable to advance.” \n", + "A crucial aspect to understand is that women leave the tech industry at over twice the rate that men do, according to the [*Harvard Business Review*](https://www.researchgate.net/publication/268325574_By_RESEARCH_REPORT_The_Athena_Factor_Reversing_the_Brain_Drain_in_Science_Engineering_and_Technology) (41% of women working in tech leave, compared to 17% of men). An analysis of over 200 books, white papers, and articles found that the reason they leave is that “they’re treated unfairly; underpaid, less likely to be fast-tracked than their male colleagues, and unable to advance.” \n", "\n", - "Studies have confirmed a number of the factors that make it harder for women to advance in the workplace. Women receive more vague feedback and personality criticism in performance evaluations, whereas men receive actionable advice tied to business outcomes (which is more useful). Women frequently experience being excluded from more creative and innovative roles, and not receiving high visibility “stretch” assignments that are helpful in getting promoted. One study found that men’s voices are perceived as more persuasive, fact-based, and logical than women’s voices, even when reading identical scripts.\n", + "Studies have confirmed a number of the factors that make it harder for women to advance in the workplace. Women receive more vague feedback and personality criticism in performance evaluations, whereas men receive actionable advice tied to business outcomes (which is more useful). Women frequently experience being excluded from more creative and innovative roles, and not receiving high-visibility “stretch” assignments that are helpful in getting promoted. 
One study found that men’s voices are perceived as more persuasive, fact-based, and logical than women’s voices, even when reading identical scripts.\n", "\n", "Receiving mentorship has been statistically shown to help men advance, but not women. The reason behind this is that when women receive mentorship, it’s advice on how they should change and gain more self-knowledge. When men receive mentorship, it’s public endorsement of their authority. Guess which is more useful in getting promoted?\n", "\n", - "As long as qualified women keep dropping out of tech, teaching more girls to code will not solve the diversity issues plaguing the field. Diversity initiatives often end up focusing primarily on white women, even though women of colour face many additional barriers. In interviews with 60 women of color who work in STEM research, 100% had experienced discrimination." + "As long as qualified women keep dropping out of tech, teaching more girls to code will not solve the diversity issues plaguing the field. Diversity initiatives often end up focusing primarily on white women, even though women of color face many additional barriers. In [interviews](https://worklifelaw.org/publications/Double-Jeopardy-Report_v6_full_web-sm.pdf) with 60 women of color who work in STEM research, 100% had experienced discrimination." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "The hiring process is particularly broken in tech. One study indicative of the disfunction comes from Triplebyte, a company that helps place software engineers in companies. They conduct a standardised technical interview as part of this process. They have a fascinating dataset: the results of how over 300 engineers did on their exam, and then the results of how those engineers did during the interview process for a variety of companies. 
The number one finding from [Triplebyte’s research](https://triplebyte.com/blog/who-y-combinator-companies-want) is that “the types of programmers that each company looks for often have little to do with what the company needs or does. Rather, they reflect company culture and the backgrounds of the founders.”\n", +    "The hiring process is particularly broken in tech. One study indicative of the dysfunction comes from Triplebyte, a company that helps place software engineers in companies, conducting a standardized technical interview as part of this process. They have a fascinating dataset: the results of how over 300 engineers did on their exam, coupled with the results of how those engineers did during the interview process for a variety of companies. The number one finding from [Triplebyte’s research](https://triplebyte.com/blog/who-y-combinator-companies-want) is that “the types of programmers that each company looks for often have little to do with what the company needs or does. Rather, they reflect company culture and the backgrounds of the founders.”\n", "\n", "This is a challenge for those trying to break into the world of deep learning, since most companies' deep learning groups today were founded by academics. These groups tend to look for people \"like them\"--that is, people that can solve complex math problems and understand dense jargon. They don't always know how to spot people who are actually good at solving real problems using deep learning.\n", "\n", " @@ -832,19 +825,19 @@ "cell_type": "markdown", "metadata": {}, "source": [ -    "### Fairness, accountability, and transparency" +    "### Fairness, Accountability, and Transparency" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ -    "The professional society for computer scientists, the ACM, runs a conference on data ethics called the \"Conference on Fairness, Accountability, and Transparency\". 
\"Fairness, Accountability, and Transparency\" sometimes goes under the acronym *FAT*, although nowadays it's changing to *FAccT*. Microsoft has a group focused on \"Fairness, Accountability, Transparency, and Ethics\" (FATE). The various versions of this lens have resulted in the acronym \"FAT\" seeing wide usage. In this section, we'll use \"FAccT\" to refer to the concepts of *Fairness, Accountability, and Transparency*.\n", + "The professional society for computer scientists, the ACM, runs a data ethics conference called the Conference on Fairness, Accountability, and Transparency. \"Fairness, Accountability, and Transparency\" which used to go under the acronym *FAT* but now uses to the less objectionable *FAccT*. Microsoft has a group focused on \"Fairness, Accountability, Transparency, and Ethics\" (FATE). In this section, we'll use \"FAccT\" to refer to the concepts of *Fairness, Accountability, and Transparency*.\n", "\n", - "FAccT is another lens that you may find useful in considering ethical issues. One useful resource for this is the free online book [Fairness and machine learning; Limitations and Opportunities](https://fairmlbook.org/), which \"gives a perspective on machine learning that treats fairness as a central concern rather than an afterthought.\" It also warns, however, that it \"is intentionally narrow in scope... A narrow framing of machine learning ethics might be tempting to technologists and businesses as a way to focus on technical interventions while sidestepping deeper questions about power and accountability. We caution against this temptation.\" Rather than provide an overview of the FAccT approach to ethics (which is better done in books such as the one linked above), our focus here will be on the limitations of this kind of narrow framing.\n", + "FAccT is another lens that you may find useful in considering ethical issues. 
One useful resource for this is the free online book [*Fairness and Machine Learning: Limitations and Opportunities*](https://fairmlbook.org/) by Solon Barocas, Moritz Hardt, and Arvind Narayanan, which \"gives a perspective on machine learning that treats fairness as a central concern rather than an afterthought.\" It also warns, however, that it \"is intentionally narrow in scope... A narrow framing of machine learning ethics might be tempting to technologists and businesses as a way to focus on technical interventions while sidestepping deeper questions about power and accountability. We caution against this temptation.\" Rather than provide an overview of the FAccT approach to ethics (which is better done in books such as that one), our focus here will be on the limitations of this kind of narrow framing.\n", "\n", - "One great way to consider whether an ethical lens is complete, is to try to come up with an example where the lens and our own ethical intuitions give diverging results. Os Keyes et al. explored this in a graphic way in their paper [A Mulching Proposal\n", - "Analysing and Improving an Algorithmic System for Turning the Elderly into High-Nutrient Slurry](https://arxiv.org/abs/1908.06166). The paper's abstract says:" + "One great way to consider whether an ethical lens is complete is to try to come up with an example where the lens and our own ethical intuitions give diverging results. Os Keyes, Jevan Hutson, and Meredith Durbin explored this in a graphic way in their paper [\"A Mulching Proposal:\n", + "Analysing and Improving an Algorithmic System for Turning the Elderly into High-Nutrient Slurry\"](https://arxiv.org/abs/1908.06166). 
The paper's abstract says:" ] }, { @@ -860,7 +853,7 @@ "source": [ "In this paper, the rather controversial proposal (\"Turning the Elderly into High-Nutrient Slurry\") and the results (\"drastically increase the algorithm's adherence to the FAT framework, resulting in a more ethical and beneficent system\") are at odds... to say the least!\n", "\n", - "In philosophy, and especially philosophy of ethics, this is one of the most effective tools: first, come up with a process, definition, set of questions, etc., which is designed to resolve some problem. Then try to come up with an example where that apparent solution results in a proposal that no-one would consider acceptable. This can then lead to a further refinement of the solution.\n", + "In philosophy, and especially philosophy of ethics, this is one of the most effective tools: first, come up with a process, definition, set of questions, etc., which is designed to resolve some problem. Then try to come up with an example where that apparent solution results in a proposal that no one would consider acceptable. This can then lead to a further refinement of the solution.\n", "\n", "So far, we've focused on things that you and your organization can do. But sometimes individual or organizational action is not enough. Sometimes, governments also need to consider policy implications." ] @@ -883,25 +876,27 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### The effectiveness of regulation" + "### The Effectiveness of Regulation" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "To look at what can cause companies to take concrete action, consider the following two examples of how Facebook has behaved. In 2018, a UN investigation found that Facebook had played a “determining role” in the ongoing genocide of the Rohingya, an ethnic minority in Mynamar that was described by UN Secretary-General Antonio Guterres as \"one of, if not the, most discriminated people in the world\". 
Local activists had been warning Facebook executives that their platform was being used to spread hate speech and incite violence since as early as 2013. In 2015, they were warned that Facebook could play the same role in Myanmar that the radio broadcasts played during the Rwandan genocide (where a million people were killed). Yet, by the end of 2015, Facebook only employed 4 contractors that spoke Burmese. As one person close to the matter said, \"That’s not 20/20 hindsight. The scale of this problem was significant and it was already apparent.\" Zuckerberg promised during the congressional hearings to hire \"dozens\" to address the genocide in Myanmar (in 2018, years after the genocide had begun, including the destruction by fire of at least 288 villages in northern Rakhine state after August 2017).\n", + "To look at what can cause companies to take concrete action, consider the following two examples of how Facebook has behaved. In 2018, a UN investigation found that Facebook had played a “determining role” in the ongoing genocide of the Rohingya, an ethnic minority in Myanmar described by UN Secretary-General Antonio Guterres as \"one of, if not the, most discriminated people in the world.\" Local activists had been warning Facebook executives that their platform was being used to spread hate speech and incite violence since as early as 2013. In 2015, they were warned that Facebook could play the same role in Myanmar that the radio broadcasts played during the Rwandan genocide (where a million people were killed). Yet, by the end of 2015, Facebook only employed four contractors that spoke Burmese. As one person close to the matter said, \"That’s not 20/20 hindsight. 
The scale of this problem was significant and it was already apparent.\" Zuckerberg promised during the congressional hearings to hire \"dozens\" to address the genocide in Myanmar (in 2018, years after the genocide had begun, including the destruction by fire of at least 288 villages in northern Rakhine state after August 2017).\n", "\n", "This stands in stark contrast to Facebook quickly [hiring 1,200 people in Germany](http://thehill.com/policy/technology/361722-facebook-opens-second-german-office-to-comply-with-hate-speech-law) to try to avoid expensive penalties (of up to 50 million euros) under a new German law against hate speech. Clearly, in this case, Facebook was more reactive to the threat of a financial penalty than to the systematic destruction of an ethnic minority.\n", "\n", - "In an [article on privacy issues](https://idlewords.com/2019/06/the_new_wilderness.htm), Maciej Ceglowski draws parallels with the environmental movement… \"This regulatory project has been so successful in the First World that we risk forgetting what life was like before it. Choking smog of the kind that today kills thousands in Jakarta and Delhi was [once emblematic of London](https://en.wikipedia.org/wiki/Pea_soup_fog). The Cuyahoga River in Ohio used to [reliably catch fire](http://www.ohiohistorycentral.org/w/Cuyahoga_River_Fire). In a particularly horrific example of unforeseen consequences, tetraethyl lead added to gasoline [raised violent crime rates](https://en.wikipedia.org/wiki/Lead%E2%80%93crime_hypothesis) worldwide for fifty years. None of these harms could have been fixed by telling people to vote with their wallet, or carefully review the environmental policies of every company they gave their business to, or to stop using the technologies in question. It took coordinated, and sometimes highly technical, regulation across jurisdictional boundaries to fix them. 
In some cases, like the [ban on commercial refrigerants](https://en.wikipedia.org/wiki/Montreal_Protocol) that depleted the ozone layer, that regulation required a worldwide consensus. We’re at the point where we need a similar shift in perspective in our privacy law.\"" + "In an [article on privacy issues](https://idlewords.com/2019/06/the_new_wilderness.htm), Maciej Ceglowski draws parallels with the environmental movement: \n", "\n", "> This regulatory project has been so successful in the First World that we risk forgetting what life was like before it. Choking smog of the kind that today kills thousands in Jakarta and Delhi was [once emblematic of London](https://en.wikipedia.org/wiki/Pea_soup_fog). The Cuyahoga River in Ohio used to [reliably catch fire](http://www.ohiohistorycentral.org/w/Cuyahoga_River_Fire). In a particularly horrific example of unforeseen consequences, tetraethyl lead added to gasoline [raised violent crime rates](https://en.wikipedia.org/wiki/Lead%E2%80%93crime_hypothesis) worldwide for fifty years. None of these harms could have been fixed by telling people to vote with their wallet, or carefully review the environmental policies of every company they gave their business to, or to stop using the technologies in question. It took coordinated, and sometimes highly technical, regulation across jurisdictional boundaries to fix them. In some cases, like the [ban on commercial refrigerants](https://en.wikipedia.org/wiki/Montreal_Protocol) that depleted the ozone layer, that regulation required a worldwide consensus. We’re at the point where we need a similar shift in perspective in our privacy law."
] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Rights and policy" + "### Rights and Policy" ] }, { @@ -912,21 +907,23 @@ "\n", "Many of the issues we are seeing in tech are actually human rights issues, such as when a biased algorithm recommends that Black defendants have longer prison sentences, when particular job ads are only shown to young people, or when police use facial recognition to identify protesters. The appropriate venue to address human rights issues is typically through the law.\n", "\n", - "We need both regulatory and legal changes, *and* the ethical behavior of individuals. Individual behavior change can’t address misaligned profit incentives, externalities (where corporations reap large profits while off-loading their costs & harms to the broader society), or systemic failures. However, the law will never cover all edge cases, and it is important that individual software developers and data scientists are equipped to make ethical decisions in practice." + "We need both regulatory and legal changes, *and* the ethical behavior of individuals. Individual behavior change can’t address misaligned profit incentives, externalities (where corporations reap large profits while offloading their costs and harms to the broader society), or systemic failures. However, the law will never cover all edge cases, and it is important that individual software developers and data scientists are equipped to make ethical decisions in practice." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Cars: a historical precedent" + "### Cars: A Historical Precedent" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "The problems we are facing are complex and there are no simple solutions. This can be discouraging, but we find hope in considering other large challenges that people have tackled throughout history. 
One example is the movement to increase car safety, covered as a case study in [Datasheets for Datasets](https://arxiv.org/abs/1803.09010) and in the design podcast [99% Invisible](https://99percentinvisible.org/episode/nut-behind-wheel/). Early cars had no seatbelts, metal knobs on the dashboard that could lodge in people’s skulls during a crash, regular plate glass windows that shattered in dangerous ways, and non-collapsible steering columns that impaled drivers. However, car companies were incredibly resistant to even discussing the idea of safety as something they could help address, and the widespread belief was that cars are just the way they are, and that it was the people using them who caused problems. It took consumer safety activists and advocates decades of work to even change the national conversation to consider that perhaps car companies had some responsibility which should be addressed through regulation. When the collapsible steering column was invented, it was not implemented for several years as there was no financial incentive to do so. Major car company General Motors hired private detectives to try to dig up dirt on consumer safety advocate Ralph Nader. The requirement of seatbelts, crash test dummies, and collapsible steering columns were major victories. It was only in 2011 that car companies were required to start using crash test dummies that would represent the average women, and not just average men’s bodies; prior to this, women were 40% more likely to be injured in a car crash of the same impact compared to a man. This is a vivid example of the ways that bias, policy, and technology have important consequences." + "The problems we are facing are complex, and there are no simple solutions. This can be discouraging, but we find hope in considering other large challenges that people have tackled throughout history. 
One example is the movement to increase car safety, covered as a case study in [\"Datasheets for Datasets\"](https://arxiv.org/abs/1803.09010) by Timnit Gebru et al. and in the design podcast [99% Invisible](https://99percentinvisible.org/episode/nut-behind-wheel/). Early cars had no seatbelts, metal knobs on the dashboard that could lodge in people’s skulls during a crash, regular plate glass windows that shattered in dangerous ways, and non-collapsible steering columns that impaled drivers. However, car companies were incredibly resistant to even discussing the idea of safety as something they could help address, and the widespread belief was that cars are just the way they are, and that it was the people using them who caused problems.\n", "\n", "It took consumer safety activists and advocates decades of work to even change the national conversation to consider that perhaps car companies had some responsibility which should be addressed through regulation. When the collapsible steering column was invented, it was not implemented for several years as there was no financial incentive to do so. Major car company General Motors hired private detectives to try to dig up dirt on consumer safety advocate Ralph Nader. The requirements for seatbelts, crash test dummies, and collapsible steering columns were major victories. It was only in 2011 that car companies were required to start using crash test dummies that would represent the average woman's body, and not just the average man's; prior to this, women were 40% more likely to be injured in a car crash of the same impact compared to a man. This is a vivid example of the ways that bias, policy, and technology have important consequences." ] }, { @@ -940,11 +937,15 @@ "cell_type": "markdown", "metadata": {}, "source": [ "Coming from a background of working with binary logic, the lack of clear answers in ethics can be frustrating at first. 
Yet, the implications of how our work impacts the world, including unintended consequences and the work becoming weaponized by bad actors, are some of the most important questions we can (and should!) consider. Even though there aren't any easy answers, there are definite pitfalls to avoid and practices to move towards more ethical behavior.\n", + "Coming from a background of working with binary logic, the lack of clear answers in ethics can be frustrating at first. Yet, the implications of how our work impacts the world, including unintended consequences and the work becoming weaponized by bad actors, are some of the most important questions we can (and should!) consider. Even though there aren't any easy answers, there are definite pitfalls to avoid and practices to follow to move toward more ethical behavior.\n", "\n", "Many people (including us!) are looking for more satisfying, solid answers about how to address harmful impacts of technology. However, given the complex, far-reaching, and interdisciplinary nature of the problems we are facing, there are no simple solutions. Julia Angwin, former senior reporter at ProPublica who focuses on issues of algorithmic bias and surveillance (and one of the 2016 investigators of the COMPAS recidivism algorithm that helped spark the field of FAccT) said in [a 2019 interview](https://www.fastcompany.com/90337954/who-cares-about-liberty-julia-angwin-and-trevor-paglen-on-privacy-surveillance-and-the-mess-were-in):\n", "\n", "> I strongly believe that in order to solve a problem, you have to diagnose it, and that we’re still in the diagnosis phase of this. If you think about the turn of the century and industrialization, we had, I don’t know, 30 years of child labor, unlimited work hours, terrible working conditions, and it took a lot of journalist muckraking and advocacy to diagnose the problem and have some understanding of what it was, and then the activism to get laws changed. 
I feel like we’re in a second industrialization of data information... I see my role as trying to make as clear as possible what the downsides are, and diagnosing them really accurately so that they can be solvable. That’s hard work, and lots more people need to be doing it. \n", "\n", - "Many people (including us!) are looking for more satisfying, solid answers of how to address harmful impacts of technology. However, given the complex, far-reaching, and interdisciplinary nature of the problems we are facing, there are no simple solutions. Julia Angwin, former senior reporter at ProPublica who focuses on issues of algorithmic bias and surveillance (and one of the 2016 investigators of the COMPAS recidivism algorithm that helped spark the field of Fairness Accountability and Transparency) said in [a 2019 interview](https://www.fastcompany.com/90337954/who-cares-about-liberty-julia-angwin-and-trevor-paglen-on-privacy-surveillance-and-the-mess-were-in), “I strongly believe that in order to solve a problem, you have to diagnose it, and that we’re still in the diagnosis phase of this. If you think about the turn of the century and industrialization, we had, I don’t know, 30 years of child labor, unlimited work hours, terrible working conditions, and it took a lot of journalist muckraking and advocacy to diagnose the problem and have some understanding of what it was, and then the activism to get laws changed. I feel like we’re in a second industrialization of data information... I see my role as trying to make as clear as possible what the downsides are, and diagnosing them really accurately so that they can be solvable. That’s hard work, and lots more people need to be doing it.” It's reassuring that Angwin thinks we are largely still in the diagnosis phase: if your understanding of these problems feels incomplete, that is normal and natural. 
Nobody has a “cure” yet, although it is vital that we continue working to better understand and address the problems we are facing.\n", + "It's reassuring that Angwin thinks we are largely still in the diagnosis phase: if your understanding of these problems feels incomplete, that is normal and natural. Nobody has a “cure” yet, although it is vital that we continue working to better understand and address the problems we are facing.\n", "\n", "One of our reviewers for this book, Fred Monroe, used to work in hedge fund trading. He told us, after reading this chapter, that many of the issues discussed here (distribution of data being dramatically different than what a model was trained on, the impact of feedback loops on a model once deployed and at scale, and so forth) were also key issues for building profitable trading models. The kinds of things you need to do to consider societal consequences are going to have a lot of overlap with things you need to do to consider organizational, market, and customer consequences--so thinking carefully about ethics can also help you think carefully about how to make your data product successful more generally!" ] }, { @@ -960,16 +961,16 @@ "cell_type": "markdown", "metadata": {}, "source": [ "1. Does ethics provide a list of \"right answers\"?\n", "1. 
How can working with people of different backgrounds help when considering ethical questions?\n", - "1. What was the role of IBM in Nazi Germany? Why did the company participate as they did? Why did the workers participate?\n", - "1. What was the role of the first person jailed in the VW diesel scandal?\n", + "1. What was the role of IBM in Nazi Germany? Why did the company participate as it did? Why did the workers participate?\n", + "1. What was the role of the first person jailed in the Volkswagen diesel scandal?\n", "1. What was the problem with a database of suspected gang members maintained by California law enforcement officials?\n", - "1. Why did YouTube's recommendation algorithm recommend videos of partially clothed children to pedophiles, even though no employee at Google programmed this feature?\n", + "1. Why did YouTube's recommendation algorithm recommend videos of partially clothed children to pedophiles, even though no employee at Google had programmed this feature?\n", "1. What are the problems with the centrality of metrics?\n", - "1. Why did Meetup.com not include gender in their recommendation system for tech meetups?\n", + "1. Why did Meetup.com not include gender in its recommendation system for tech meetups?\n", "1. What are the six types of bias in machine learning, according to Suresh and Guttag?\n", "1. Give two examples of historical race bias in the US.\n", - "1. Where are most images in Imagenet from?\n", - "1. In the paper \"Does Machine Learning Automate Moral Hazard and Error\" why is sinusitis found to be predictive of a stroke?\n", + "1. Where are most images in ImageNet from?\n", + "1. In the paper [\"Does Machine Learning Automate Moral Hazard and Error\"](https://scholar.harvard.edu/files/sendhil/files/aer.p20171084.pdf) why is sinusitis found to be predictive of a stroke?\n", "1. What is representation bias?\n", "1. How are machines and people different, in terms of their use for making decisions?\n", "1. 
Is disinformation the same as \"fake news\"?\n", @@ -982,7 +983,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research:" + "### Further Research:" ] }, { @@ -990,12 +991,12 @@ "metadata": {}, "source": [ "1. Read the article \"What Happens When an Algorithm Cuts Your Healthcare\". How could problems like this be avoided in the future?\n", - "1. Research to find out more about YouTube's recommendation system and its societal impacts. Do you think recommendation systems must always have feedback loops with negative results? What approaches could Google take? What about the government?\n", - "1. Read the paper \"Discrimination in Online Ad Delivery\". Do you think Google should be considered responsible for what happened to Dr Sweeney? What would be an appropriate response?\n", + "1. Research to find out more about YouTube's recommendation system and its societal impacts. Do you think recommendation systems must always have feedback loops with negative results? What approaches could Google take to avoid them? What about the government?\n", + "1. Read the paper [\"Discrimination in Online Ad Delivery\"](https://arxiv.org/abs/1301.6822). Do you think Google should be considered responsible for what happened to Dr. Sweeney? What would be an appropriate response?\n", "1. How can a cross-disciplinary team help avoid negative consequences?\n", - "1. Read the paper \"Does Machine Learning Automate Moral Hazard and Error\" in American Economic Review. What actions do you think should be taken to deal with the issues identified in this paper?\n", + "1. Read the paper \"Does Machine Learning Automate Moral Hazard and Error\". What actions do you think should be taken to deal with the issues identified in this paper?\n", "1. Read the article \"How Will We Prevent AI-Based Forgery?\" Do you think Etzioni's proposed approach could work? Why?\n", - "1. Complete the section \"Analyze a project you are working on\" in this chapter.\n", + "1. 
Complete the section \"Analyze a Project You Are Working On\" in this chapter.\n", "1. Consider whether your team could be more diverse. If so, what approaches might help?" ] }, @@ -1003,26 +1004,26 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Section 1: that's a wrap!" + "## Section 1: That's a Wrap!" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Congratulations! You've made it to the end of the first section of the book. In this section we've tried to show you what deep learning can do, and how you can use it to create real applications and products. At this point, you will get a lot more out of the book if you spend some time trying out what you've learnt. Perhaps you have already been doing this as you go along — in which case, great! But if not, that's no problem either… Now is a great time to start experimenting yourself.\n", + "Congratulations! You've made it to the end of the first section of the book. In this section we've tried to show you what deep learning can do, and how you can use it to create real applications and products. At this point, you will get a lot more out of the book if you spend some time trying out what you've learned. Perhaps you have already been doing this as you go along—in which case, great! If not, that's no problem either... Now is a great time to start experimenting yourself.\n", "\n", - "If you haven't been to the book website yet, head over there now. Remember, you can find it here: [book.fast.ai](https://book.fast.ai). It's really important that you have got yourself set up to run the notebooks. Becoming an effective deep learning practitioner is all about practice. So you need to be training models. So please go get the notebooks running now if you haven't already! 
And also have a look on the website for any important updates or notices; deep learning changes fast, and we can't change the words that are printed in this book, so the website is where you need to look to ensure you have the most up-to-date information.\n", + "If you haven't been to the [book's website](https://book.fast.ai) yet, head over there now. It's really important that you get yourself set up to run the notebooks. Becoming an effective deep learning practitioner is all about practice, so you need to be training models. So, please go get the notebooks running now if you haven't already! And also have a look on the website for any important updates or notices; deep learning changes fast, and we can't change the words that are printed in this book, so the website is where you need to look to ensure you have the most up-to-date information.\n", "\n", "Make sure that you have completed the following steps:\n", "\n", - "- Connected to one of the GPU Jupyter servers recommended on the book website\n", - "- Run the first notebook yourself\n", - "- Uploaded an image that you find in the first notebook; then try a few different images of different kinds to see what happens\n", - "- Run the second notebook, collecting your own dataset based on image search queries that you come up with\n", - "- Thought about how you can use deep learning to help you with your own projects, including what kinds of data you could use, what kinds of problems may come up, and how you might be able to mitigate these issues in practice.\n", + "- Connect to one of the GPU Jupyter servers recommended on the book's website.\n", + "- Run the first notebook yourself.\n", + "- Upload an image that you find in the first notebook; then try a few different images of different kinds to see what happens.\n", + "- Run the second notebook, collecting your own dataset based on image search queries that you come up with.\n", + "- Think about how you can use deep learning to help you with your own 
projects, including what kinds of data you could use, what kinds of problems may come up, and how you might be able to mitigate these issues in practice.\n", "\n", - "In the next section of the book we will learn about how and why deep learning works, instead of just seeing how we can use it in practice. Understanding the how and why is important for both practitioners and researchers, because in this fairly new field nearly every project requires some level of customisation and debugging. The better you understand the foundations of deep learning, the better your models will be. These foundations are less important for executives, product managers, and so forth (although still useful, so feel free to keep reading!), but they are critical for anybody who is actually training and deploying models themselves." + "In the next section of the book you will learn about how and why deep learning works, instead of just seeing how you can use it in practice. Understanding the how and why is important for both practitioners and researchers, because in this fairly new field nearly every project requires some level of customization and debugging. The better you understand the foundations of deep learning, the better your models will be. These foundations are less important for executives, product managers, and so forth (although still useful, so feel free to keep reading!), but they are critical for anybody who is actually training and deploying models themselves." ] }, { diff --git a/04_mnist_basics.ipynb b/04_mnist_basics.ipynb index 466aa06eb..20fae3cd0 100644 --- a/04_mnist_basics.ipynb +++ b/04_mnist_basics.ipynb @@ -24,7 +24,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# Under the hood: training a digit classifier" + "# Under the Hood: Training a Digit Classifier" ] }, { @@ -33,45 +33,45 @@ "source": [ "Having seen what it looks like to actually train a variety of models in Chapter 2, let’s now look under the hood and see exactly what is going on. 
We’ll start by using computer vision to introduce fundamental tools and concepts for deep learning.\n", "\n", - "To be exact, we'll discuss the role of arrays and tensors, and of broadcasting, a powerful technique for using them expressively. We'll explain stochastic gradient descent (SGD), the mechanism for learning by updating weights automatically. We'll discuss the choice of a loss function for our basic classification task, and the role of mini-batches. We'll also describe the math that a basic neural network is actually doing. Finally, we'll put all these pieces together.\n", + "To be exact, we'll discuss the roles of arrays and tensors and of broadcasting, a powerful technique for using them expressively. We'll explain stochastic gradient descent (SGD), the mechanism for learning by updating weights automatically. We'll discuss the choice of a loss function for our basic classification task, and the role of mini-batches. We'll also describe the math that a basic neural network is actually doing. Finally, we'll put all these pieces together.\n", "\n", "In future chapters we’ll do deep dives into other applications as well, and see how these concepts and tools generalize. But this chapter is about laying foundation stones. To be frank, that also makes this one of the hardest chapters, because of how these concepts all depend on each other. Like an arch, all the stones need to be in place for the structure to stay up. Also like an arch, once that happens, it's a powerful structure that can support other things. But it requires some patience to assemble.\n", "\n", - "So let us begin. The first step is to consider how images are represented in a computer." + "Let's begin. The first step is to consider how images are represented in a computer." 
] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Pixels: the foundations of computer vision" + "## Pixels: The Foundations of Computer Vision" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "In order to understand what happens in a computer vision model, we first have to understand how computers handle images. We'll use one of the most famous datasets in computer vision, [MNIST](https://en.wikipedia.org/wiki/MNIST_database), for our experiments. MNIST contains hand-written digits, collected by the National Institute of Standards and Technology, and collated into a machine learning dataset by Yann Lecun and his colleagues. Lecun used MNIST in 1998 to demonstrate [Lenet 5](https://yann.lecun.com/exdb/lenet/), the first computer system to demonstrate practically useful recognition of hand-written digit sequences. This was one of the most important breakthroughs in the history of AI." + "In order to understand what happens in a computer vision model, we first have to understand how computers handle images. We'll use one of the most famous datasets in computer vision, [MNIST](https://en.wikipedia.org/wiki/MNIST_database), for our experiments. MNIST contains images of handwritten digits, collected by the National Institute of Standards and Technology and collated into a machine learning dataset by Yann Lecun and his colleagues. Lecun used MNIST in 1998 in [Lenet-5](http://yann.lecun.com/exdb/lenet/), the first computer system to demonstrate practically useful recognition of handwritten digit sequences. This was one of the most important breakthroughs in the history of AI." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Sidebar: Tenacity and deep learning" + "## Sidebar: Tenacity and Deep Learning" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "The story of deep learning is one of tenacity and grit from a handful of dedicated researchers. After early hopes (and hype!) 
neural networks went out of favor in the 1990's and 2000's, and just a handful of researchers kept trying to make them work well. Three of them, Yann Lecun, Yoshua Bengio and Geoffrey Hinton were awarded the highest honor in computer science, the Turing Award (generally considered the \"Nobel Prize of computer science\") after triumphing despite the deep skepticism and disinterest of the wider machine learning and statistics community.\n", + "The story of deep learning is one of tenacity and grit by a handful of dedicated researchers. After early hopes (and hype!) neural networks went out of favor in the 1990s and 2000s, and just a handful of researchers kept trying to make them work well. Three of them, Yann Lecun, Yoshua Bengio, and Geoffrey Hinton, were awarded the highest honor in computer science, the Turing Award (generally considered the \"Nobel Prize of computer science\"), in 2018 after triumphing despite the deep skepticism and disinterest of the wider machine learning and statistics community.\n", "\n", - "Geoff Hinton has told of how even academic papers showing dramatically better results than anything previously published would be rejected from top journals and conferences, just because they used a neural network. Yann Lecun's work on convolutional neural networks, which we will study in the next section, showed that these models could read hand-written text--something that had never been achieved before. However his breakthrough was ignored by most researchers, even as it was used commercially to read 10% of the checks in the US!\n", + "Geoff Hinton has told of how even academic papers showing dramatically better results than anything previously published would be rejected by top journals and conferences, just because they used a neural network. Yann Lecun's work on convolutional neural networks, which we will study in the next section, showed that these models could read handwritten text--something that had never been achieved before. 
However, his breakthrough was ignored by most researchers, even as it was used commercially to read 10% of the checks in the US!\n", "\n", - "In addition to these three Turing Award winners, there are many other researchers who have battled to get us to where we are today. For instance, Jurgen Schmidhuber (who many believe should have shared in the Turing Award) pioneered many important ideas, including working with his student Sepp Hochreiter on the *LSTM* architecture (widely used for speech recognition and other text modeling tasks, and used in the IMDB example in <>). Perhaps most important of all, Paul Werbos in 1974 invented back-propagation for neural networks, the technique shown in this chapter and used universally for training neural networks ([Werbos 1994](https://books.google.com/books/about/The_Roots_of_Backpropagation.html?id=WdR3OOM2gBwC)). His development was almost entirely ignored for decades, but today it is the most important foundation of modern AI.\n", + "In addition to these three Turing Award winners, there are many other researchers who have battled to get us to where we are today. For instance, Jurgen Schmidhuber (who many believe should have shared in the Turing Award) pioneered many important ideas, including working with his student Sepp Hochreiter on the long short-term memory (LSTM) architecture (widely used for speech recognition and other text modeling tasks, and used in the IMDb example in <>). Perhaps most important of all, Paul Werbos in 1974 invented back-propagation for neural networks, the technique shown in this chapter and used universally for training neural networks ([Werbos 1994](https://books.google.com/books/about/The_Roots_of_Backpropagation.html?id=WdR3OOM2gBwC)). His development was almost entirely ignored for decades, but today it is considered the most important foundation of modern AI.\n", "\n", - "There is a lesson here for all of us! 
On your deep learning journey you will face many obstacles, both technical, and (even more difficult) people around you who don't believe you'll be successful. There's one *guaranteed* way to fail, and that's to stop trying. We've seen that the only consistent trait amongst every fast.ai student that's gone on to be a world-class practitioner is that they are all very tenacious." + "There is a lesson here for all of us! On your deep learning journey you will face many obstacles, both technical and (even more difficult) those posed by people around you who don't believe you'll be successful. There's one *guaranteed* way to fail, and that's to stop trying. We've seen that the only consistent trait amongst every fast.ai student that's gone on to be a world-class practitioner is that they are all very tenacious." ] }, { @@ -85,7 +85,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "For this initial tutorial we are just going to try to create a model that can classify any image as a \"3\" or a \"7\". So let's download a sample of MNIST which contains images of just these digits:" + "For this initial tutorial we are just going to try to create a model that can classify any image as a 3 or a 7. So let's download a sample of MNIST that contains images of just these digits:" ] }, { @@ -111,7 +111,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We can see what's in this directory by using `ls()`, a method added by fastai. This method returns an object of a special fastai class called `L`, which has all the same functionality of Python's built-in `list`, plus a lot more. One of its handy features is that, when printed, it displays the count of items, before listing the items themselves (if there's more than 10 items, it just shows the first few)." + "We can see what's in this directory by using `ls`, a method added by fastai. 
This method returns an object of a special fastai class called `L`, which has all the same functionality of Python's built-in `list`, plus a lot more. One of its handy features is that, when printed, it displays the count of items, before listing the items themselves (if there are more than 10 items, it just shows the first few):" ] }, { @@ -138,7 +138,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The MNIST dataset shows a very common layout for machine learning datasets: separate folders for the *training set*, which is used to train a model, and the *validation set* (and/or *test set*), which is used to evaluate the model (we'll be talking a lot about these concepts very soon!) Let's see what's inside the training set:" + "The MNIST dataset follows a common layout for machine learning datasets: separate folders for the training set and the validation set (and/or test set). Let's see what's inside the training set:" ] }, { @@ -165,7 +165,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "There's a folder of \"3\"s, and a folder of \"7\"s. In machine learning parlance, we say that \"3\" and \"7\" are the *labels* (or targets) in this dataset. Let's take a look in one of these folders (using `sorted` to ensure we all get the same order of files):" + "There's a folder of 3s, and a folder of 7s. In machine learning parlance, we say that \"3\" and \"7\" are the *labels* (or targets) in this dataset. Let's take a look in one of these folders (using `sorted` to ensure we all get the same order of files):" ] }, { @@ -194,7 +194,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "As we might expect, it's full of image files. Let’s take a look at one now. Here’s an image of a handwritten number ‘3’, taken from the famous MNIST dataset of handwritten numbers:" + "As we might expect, it's full of image files. Let’s take a look at one now. 
Here’s an image of a handwritten number 3, taken from the famous MNIST dataset of handwritten numbers:" ] }, { @@ -226,7 +226,7 @@ "source": [ "Here we are using the `Image` class from the *Python Imaging Library* (PIL), which is the most widely used Python package for opening, manipulating, and viewing images. Jupyter knows about PIL images, so it displays the image for us automatically.\n", "\n", - "In a computer, everything is represented as a number. To view the numbers that make up this image, we have to convert it to a *NumPy array* or a *PyTorch tensor*. For instance, here's a few numbers from the top-left of the image, converted to a NumPy array:" + "In a computer, everything is represented as a number. To view the numbers that make up this image, we have to convert it to a *NumPy array* or a *PyTorch tensor*. For instance, here's what a section of the image looks like, converted to a NumPy array:" ] }, { @@ -258,7 +258,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "...and the same thing as a PyTorch tensor:" + "The `4:10` indicates we requested the rows from index 4 (included) to 10 (not included) and the same for the columns. NumPy indexes from top to bottom and left to right, so this section is located in the top-left corner of the image. 
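The `4:10`-style row/column slicing described here can be sketched on a toy array first (the values below are made up for illustration; they are not real MNIST pixels):

```python
import numpy as np

# A toy 6x6 "image" standing in for the 28x28 MNIST digit
# (made-up values, not real pixel data).
im = np.arange(36).reshape(6, 6)

# Rows 1 (included) up to 4 (excluded), and the same for the columns,
# pick out a 3x3 block near the top-left corner.
block = im[1:4, 1:4]
print(block)
# [[ 7  8  9]
#  [13 14 15]
#  [19 20 21]]
```

The same `start:stop` slice syntax works unchanged on a PyTorch tensor.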
Here's the same thing as a PyTorch tensor:" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "We can slice the array to pick just a part with the top of the digit in it, and then use a Pandas DataFrame to color-code the values using a gradient, which shows us clearly how the image is created from the pixel values:" + "We can slice the array to pick just the part with the top of the digit in it, and then use a Pandas DataFrame to color-code the values using a gradient, which shows us clearly how the image is created from the pixel values:" ] }, { @@ -1355,32 +1355,32 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "You can see that the background white pixels are stored as the number zero, black is the number 255, and shades of grey are between the two. The entire image contains 28 pixels across and 28 pixels down, for a total of 768 pixels. (This is much smaller than an image that you would get from a phone camera, which has millions of pixels, but is a convenient size for our initial learning and experiments. We will build up to bigger, full-colour images soon.)\n", + "You can see that the background white pixels are stored as the number 0, black is the number 255, and shades of gray are between the two. The entire image contains 28 pixels across and 28 pixels down, for a total of 784 pixels. (This is much smaller than an image that you would get from a phone camera, which has millions of pixels, but is a convenient size for our initial learning and experiments. We will build up to bigger, full-color images soon.)\n", "\n", - "So, now you've seen what an image looks like to a computer, let's recall our goal: create a model that can recognise “3”s and “7”s. How might you go about getting a computer to do that?\n", + "So, now you've seen what an image looks like to a computer, let's recall our goal: create a model that can recognize 3s and 7s. 
How might you go about getting a computer to do that?\n", "\n", - "> stop: Before you read on, take a moment to think about how a computer might be able to recognize these two different digits. What kind of features might it be able to look at? How might it be able to identify these features? How could it combine them together? Learning works best when you try to solve problems yourself, rather than just reading somebody else's answers; so step away from this book for a few minutes, grab a piece of paper and pen, and jot some ideas down…" + "> Warning: Stop and Think!: Before you read on, take a moment to think about how a computer might be able to recognize these two different digits. What kinds of features might it be able to look at? How might it be able to identify these features? How could it combine them together? Learning works best when you try to solve problems yourself, rather than just reading somebody else's answers; so step away from this book for a few minutes, grab a piece of paper and pen, and jot some ideas down…" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## First try: pixel similarity" + "## First Try: Pixel Similarity" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "So, here is a first idea: how about we find the average pixel value for every pixel of the threes and do the same for each of the sevens. This will give us two group averages, defining what we might call the \"ideal\" 3 and 7. Then, to classify an image as digit, we see which of these two ideal digits the image is most similar to. This certainly seems like it should be better than nothing, so it will make a good baseline." + "So, here is a first idea: how about we find the average pixel value for every pixel of the 3s, then do the same for the 7s. This will give us two group averages, defining what we might call the \"ideal\" 3 and 7. 
Then, to classify an image as one digit or the other, we see which of these two ideal digits the image is most similar to. This certainly seems like it should be better than nothing, so it will make a good baseline." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "> note: A _baseline_ is a simple model which you are confident should perform reasonably well. It should be very simple to implement, and very easy to test, so that you can then test each of your improved ideas, and make sure they are always better than your baseline. Without starting with a sensible baseline, it is very difficult to know whether your super fancy models are actually any good. One good approach to creating a baseline is doing what we have done here: think of a simple, easy to implement model. Another good approach is to search around to find other people that have solved similar problems to yours, and download and run their code on your dataset. Ideally, try both of these!" + "> jargon: Baseline: A simple model which you are confident should perform reasonably well. It should be very simple to implement, and very easy to test, so that you can then test each of your improved ideas, and make sure they are always better than your baseline. Without starting with a sensible baseline, it is very difficult to know whether your super-fancy models are actually any good. One good approach to creating a baseline is doing what we have done here: think of a simple, easy-to-implement model. Another good approach is to search around to find other people that have solved similar problems to yours, and download and run their code on your dataset. Ideally, try both of these!" ] }, { @@ -1389,9 +1389,9 @@ "source": [ "Step one for our simple model is to get the average of pixel values for each of our two groups. In the process of doing this, we will learn a lot of neat Python numeric programming tricks!\n", "\n", - "Let's create a tensor containing all of our threes stacked together. 
We already know how to create a tensor containing a single image. To create a tensor containing all the images in a directory, we will first use a Python list comprehension to create a plain list of the single image tensors.\n", + "Let's create a tensor containing all of our 3s stacked together. We already know how to create a tensor containing a single image. To create a tensor containing all the images in a directory, we will first use a Python list comprehension to create a plain list of the single image tensors.\n", "\n", - "We will use Jupyter to do some little checks of our work along the way -- in this case, making sure that the number of returned items seems reasonable:" + "We will use Jupyter to do some little checks of our work along the way--in this case, making sure that the number of returned items seems reasonable:" ] }, { @@ -1420,14 +1420,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "> note: List and dictionary comprehensions are a wonderful feature of Python. Many Python programmers use them every day, including all of the authors of this book—they are part of \"idiomatic Python\". But programmers coming from other languages may have never seen them before. There are a lot of great tutorials just a web search away, so we won't spend a long time discussing them now. Here is a quick explanation and example to get you started. A list comprehension looks like this: `new_list = [f(o) for o in a_list if o>0]`. This would return every element of `a_list` that is greater than zero, after passing it to the function `f`. There are three parts here: the collection you are iterating over (`a_list`), an optional filter (`if o>0`), and something to do to each element (`f(o)`). It's not only shorter to write but way faster than the alternative ways of creating the same list with a loop." + "> note: List Comprehensions: List and dictionary comprehensions are a wonderful feature of Python. 
Many Python programmers use them every day, including the authors of this book—they are part of \"idiomatic Python.\" But programmers coming from other languages may have never seen them before. There are a lot of great tutorials just a web search away, so we won't spend a long time discussing them now. Here is a quick explanation and example to get you started. A list comprehension looks like this: `new_list = [f(o) for o in a_list if o>0]`. This will return every element of `a_list` that is greater than 0, after passing it to the function `f`. There are three parts here: the collection you are iterating over (`a_list`), an optional filter (`if o>0`), and something to do to each element (`f(o)`). It's not only shorter to write but way faster than the alternative ways of creating the same list with a loop." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "We'll also check that one of the images looks okay. Since we now have tensors (which Jupyter by default will print as values), rather than PIL images (which Jupyter by default will display as an image), we need to use fastai's `show_image` function to display it:" + "We'll also check that one of the images looks okay. Since we now have tensors (which Jupyter by default will print as values), rather than PIL images (which Jupyter by default will display as images), we need to use fastai's `show_image` function to display it:" ] }, { @@ -1456,11 +1456,11 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "For every pixel position, we want to compute the average over all the images of the intensity of that pixel. To do this we first combine all the images in this list into a single three-dimensional tensor. The most common way to describe such a tensor is to call it a *rank-3 tensor*. We often need to stack up individual tensors in a collection into a single tensor. 
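The comprehension pattern from the note above can be tried as-is; `f` and `a_list` here are made-up stand-ins for the note's placeholders:

```python
# Hypothetical stand-ins for the `f` and `a_list` in the note above.
def f(o): return o * 10

a_list = [3, -1, 4, -1, 5]

# Keep only the elements greater than 0, then apply `f` to each one.
new_list = [f(o) for o in a_list if o > 0]
print(new_list)  # [30, 40, 50]
```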
Unsurprisingly, PyTorch comes with a function called `stack`.\n", + "For every pixel position, we want to compute the average over all the images of the intensity of that pixel. To do this we first combine all the images in this list into a single three-dimensional tensor. The most common way to describe such a tensor is to call it a *rank-3 tensor*. We often need to stack up individual tensors in a collection into a single tensor. Unsurprisingly, PyTorch comes with a function called `stack` that we can use for this purpose.\n", "\n", - "Some operations in PyTorch, such as taking a mean, require us to cast our integer types to float types. Since we'll be needing this later, we'll also cast our stacked tensor to `float` now. Casting in PyTorch is as simple as typing the name of the type you wish to cast to, and treating it as a method.\n", + "Some operations in PyTorch, such as taking a mean, require us to *cast* our integer types to float types. Since we'll be needing this later, we'll also cast our stacked tensor to `float` now. Casting in PyTorch is as simple as typing the name of the type you wish to cast to, and treating it as a method.\n", "\n", - "Generally when images are floats, the pixels are expected to be between zero and one, so we will also divide by 255 here." + "Generally when images are floats, the pixel values are expected to be between 0 and 1, so we will also divide by 255 here:" ] }, { @@ -1489,9 +1489,9 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Perhaps the most important attribute of a tensor is its shape. This tells you the length of each axis. In this case, we can see that we have 6131 images, each of size 28 x 28 pixels. There is nothing specifically about this tensor that says that the first axis is the number of images, the second is the height, and the third is the width — the semantics of a tensor are entirely up to us, and how we construct it. 
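A minimal sketch of this stack-cast-scale step, with random tensors standing in for the real list of image tensors:

```python
import torch

# Five fake 28x28 integer "images" (random values, not real MNIST data).
image_list = [torch.randint(0, 256, (28, 28)) for _ in range(5)]

# Stack into one rank-3 tensor, cast to float, and scale into [0, 1].
stacked = torch.stack(image_list).float() / 255

print(stacked.shape)  # torch.Size([5, 28, 28])
```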
As far as PyTorch is concerned, it is just a bunch of numbers in memory.\n", + "Perhaps the most important attribute of a tensor is its *shape*. This tells you the length of each axis. In this case, we can see that we have 6,131 images, each of size 28\*28 pixels. There is nothing specifically about this tensor that says that the first axis is the number of images, the second is the height, and the third is the width—the semantics of a tensor are entirely up to us, and how we construct it. As far as PyTorch is concerned, it is just a bunch of numbers in memory.\n", "\n", - "The length of a tensor's shape is its rank." + "The *length* of a tensor's shape is its rank:" ] }, { @@ -1520,14 +1520,14 @@ "source": [ "It is really important for you to commit to memory and practice these bits of tensor jargon: _rank_ is the number of axes or dimensions in a tensor; _shape_ is the size of each axis of a tensor.\n", "\n", - "> A: Watch out because the term \"dimension\" is sometimes used in two ways. Consider that we live in \"three dimensonal space\" where a physical position can be described by a 3-vector `v`. But according to PyTorch, the attribute `v.ndim` (which sure looks like the \"number of dimensions\" of `v`) equals one not three! Why? Because v is a vector, which is a tensor of rank one, meaning that it has only one _axis_ (even if that axis has a length of three). In other words, sometimes dimension is used for the size of an axis (\"space is 3-dimensional\"); other times, it is used for the rank, or the number of axes (\"a matrix has two dimensions\"). When confused, I find it helpful to translate all statements into the terms of rank, axis, and length, which are unambiguous terms." + "> A: Watch out because the term \"dimension\" is sometimes used in two ways. Consider that we live in \"three-dimensional space\" where a physical position can be described by a 3-vector `v`. 
But according to PyTorch, the attribute `v.ndim` (which sure looks like the \"number of dimensions\" of `v`) equals one, not three! Why? Because `v` is a vector, which is a tensor of rank one, meaning that it has only one _axis_ (even if that axis has a length of three). In other words, sometimes dimension is used for the size of an axis (\"space is three-dimensional\"); other times, it is used for the rank, or the number of axes (\"a matrix has two dimensions\"). When confused, I find it helpful to translate all statements into terms of rank, axis, and length, which are unambiguous terms." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "You can also get a tensor's rank directly with `ndim`." + "We can also get a tensor's rank directly with `ndim`:" ] }, { @@ -1554,9 +1554,9 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Finally, we can compute what the ideal three looks like. We calculate the mean of all the image tensors, by taking the mean along dimension zero of our stacked, rank-3 tensor. This is the dimension which indexes over all the images.\n", + "Finally, we can compute what the ideal 3 looks like. We calculate the mean of all the image tensors by taking the mean along dimension 0 of our stacked, rank-3 tensor. This is the dimension that indexes over all the images.\n", "\n", - "In other words, for every pixel position, this will compute the average of that pixel over all images. So the result will be one value for every pixel position -- in other words, a single image. Here it is:" + "In other words, for every pixel position, this will compute the average of that pixel over all images. The result will be one value for every pixel position, or a single image. Here it is:" ] }, { @@ -1586,9 +1586,9 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "According to this dataset, this is the ideal number three! (You may not like it, but this is what peak number 3 performance looks like.) 
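The `mean` along dimension 0 used to build this ideal digit can be checked on a tiny made-up tensor: averaging three 2x2 "images" leaves a single 2x2 "average image":

```python
import torch

# Three tiny 2x2 "images" with made-up values.
t = torch.tensor([[[0., 2.], [4., 6.]],
                  [[2., 4.], [6., 8.]],
                  [[4., 6.], [8., 10.]]])

# mean(0) averages across the first axis (the image index),
# leaving one value per pixel position.
print(t.mean(0))  # tensor([[2., 4.], [6., 8.]])
```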
You can see how it's very dark where all the images agree it should be dark, but it becomes wispy and blurry where the images disagree. \n", + "According to this dataset, this is the ideal number 3! (You may not like it, but this is what peak number 3 performance looks like.) You can see how it's very dark where all the images agree it should be dark, but it becomes wispy and blurry where the images disagree. \n", "\n", - "Let's do the same thing for the sevens, but let's put all the steps together at once to save some time:" + "Let's do the same thing for the 7s, but put all the steps together at once to save some time:" ] }, { @@ -1618,11 +1618,11 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Let's now pick an arbitrary \"3\", and measure its *distance* from each of these \"ideal digits\".\n", + "Let's now pick an arbitrary 3 and measure its *distance* from our \"ideal digits.\"\n", "\n", - "> stop: How would you calculate how similar a particular image is from each of our ideal digits? Remember to step away from this book and jot down some ideas, before you move on! Research shows that recall and understanding improves dramatically when you are *engaged* with the learning process by solving problems, experimenting, and trying new ideas yourself\n", + "> stop: Stop and Think!: How would you calculate how similar a particular image is to each of our ideal digits? Remember to step away from this book and jot down some ideas before you move on! Research shows that recall and understanding improve dramatically when you are engaged with the learning process by solving problems, experimenting, and trying new ideas yourself.\n", "\n", - "Here's a sample \"3\":" + "Here's a sample 3:" ] }, { @@ -1652,16 +1652,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We can't just add up the differences between the pixels of this image and the ideal digit. Why not?...\n", + "How can we determine its distance from our ideal 3? 
We can't just add up the differences between the pixels of this image and the ideal digit. Some differences will be positive while others will be negative, and these differences will cancel out, resulting in a situation where an image that is too dark in some places and too light in others might be shown as having zero total differences from the ideal. That would be misleading!\n", "\n", - "Because some differences will be positive, some will be negative, and these differences cancel out, resulting in a situation where an image which is too dark in some places and too light in others might be shown as having zero total differences from the ideal. That would be misleading!\n", + "To avoid this, there are two main ways data scientists measure distance in this context:\n", "\n", - "To avoid this, there's two main ways data scientists measure *distance* in this context:\n", + "- Take the mean of the *absolute value* of differences (absolute value is the function that replaces negative values with positive values). This is called the *mean absolute difference* or *L1 norm*\n", + "- Take the mean of the *square* of differences (which makes everything positive) and then take the *square root* (which undoes the squaring). This is called the *root mean squared error* (RMSE) or *L2 norm*.\n", "\n", - "- Take the mean of the *absolute value* of differences (_absolute value_ is the function that replaces negative values with positive values). This is called the *mean absolute difference* or *L1 norm*\n", - "- Take the mean of the *square* of differences (which makes everything positive) and then take the *square root* (which *undoes* the squaring). This is called the *root mean squared error (RMSE)* or *L2 norm*.\n", - "\n", - "> important: In this book we generally assume that you have completed high school maths, and remember at least some of it... But everybody forgets some things! It all depends on what you happen to have had reason to practice in the meantime. 
Perhaps you have forgotten what a _square root_ is, or exactly how they work. No problem! Any time you come across a maths concept that is not explained fully in this book, don't just keep moving on, but instead stop and look it up. Make sure you understand the basic idea of what that the maths concept is, how it works, and why we might be using it. One of the best places to refresh your understanding is Khan Academy. For instance, Khan Academy has a great [introduction to square roots](https://www.khanacademy.org/math/algebra/x2f8bb11595b61c86:rational-exponents-radicals/x2f8bb11595b61c86:radicals/v/understanding-square-roots)." + "> important: It's Okay to Have Forgotten Your Math: In this book we generally assume that you have completed high school math, and remember at least some of it... But everybody forgets some things! It all depends on what you happen to have had reason to practice in the meantime. Perhaps you have forgotten what a _square root_ is, or exactly how they work. No problem! Any time you come across a math concept that is not explained fully in this book, don't just keep moving on; instead, stop and look it up. Make sure you understand the basic idea, how it works, and why we might be using it. One of the best places to refresh your understanding is Khan Academy. For instance, Khan Academy has a great [introduction to square roots](https://www.khanacademy.org/math/algebra/x2f8bb11595b61c86:rational-exponents-radicals/x2f8bb11595b61c86:radicals/v/understanding-square-roots)." ] }, { @@ -1719,21 +1717,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "In both cases, the distance between our `3` and the \"ideal\" `3` is less than the distance to the ideal `7`. So our simple model will give the right prediction in this case." 
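The two distance measures behind that comparison can be written out by hand and checked against `torch.nn.functional`; `a` and `b` below are random stand-ins rather than the real image tensors:

```python
import torch
import torch.nn.functional as F

# Random stand-ins for an image and an "ideal digit" (not real data).
a, b = torch.rand(28, 28), torch.rand(28, 28)

# Mean absolute difference (L1 norm), written out by hand...
l1 = (a - b).abs().mean()
# ...and root mean squared error (L2 norm): square, mean, square root.
l2 = ((a - b) ** 2).mean().sqrt()

# The same quantities via torch.nn.functional's loss functions.
assert torch.isclose(l1, F.l1_loss(a, b))
assert torch.isclose(l2, F.mse_loss(a, b).sqrt())
```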
- ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "> S: Intuitively, the difference between L1 norm and mean squared error (*MSE*) is that the latter will penalize bigger mistakes more heavily than the former (and be more lenient with small mistakes)." + "In both cases, the distance between our 3 and the \"ideal\" 3 is less than the distance to the ideal 7. So our simple model will give the right prediction in this case." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "PyTorch already provides both of these as *loss functions*. You'll find these inside `torch.nn.functional`, which the PyTorch team recommends importing as `F` (and is available by default under that name in fastai). Here *MSE* stands for *mean squared error*, and *L1* refers to the standard mathematical jargon for *mean absolute value* (in math it's called the *L1 norm*)." + "PyTorch already provides both of these as *loss functions*. You'll find these inside `torch.nn.functional`, which the PyTorch team recommends importing as `F` (and is available by default under that name in fastai):" ] }, { @@ -1760,30 +1751,44 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "> J: When I first came across this \"L1\" thingie, I looked it up to see what on Earth it meant, found on Google that it is a *vector norm* using *absolute value*, so looked up *vector norm* and started reading: *Given a vector space V over a field F of the real or complex numbers, a norm on V is a nonnegative-valued any function p: V → \\[0,+∞) with the following properties: For all a ∈ F and all u, v ∈ V, p(u + v) ≤ p(u) + p(v)...* Then I stopped reading. \"Ugh, I'll never understand math!\" I thought, for the thousandth time. Since then I've learned that every time these complex mathy bits of jargon come up in practice, it turns out I can replace them with a tiny bit of code! Like the _L1 loss_ is just equal to `(a-b).abs().mean()`, where `a` and `b` are tensors. 
I guess mathy folks just think differently to me... I'll make sure, in this book, every time some mathy jargon comes up, I'll give you the little bit of code it's equal to as well, and explain in common sense terms what's going on." + "Here `mse` stands for *mean squared error*, and `l1` refers to the standard mathematical jargon for *mean absolute value* (in math it's called the *L1 norm*)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "> S: Intuitively, the difference between L1 norm and mean squared error (MSE) is that the latter will penalize bigger mistakes more heavily than the former (and be more lenient with small mistakes)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "> J: When I first came across this \"L1\" thingie, I looked it up to see what on earth it meant. I found on Google that it is a _vector norm_ using _absolute value_, so looked up _vector norm_ and started reading: _Given a vector space V over a field F of the real or complex numbers, a norm on V is a nonnegative-valued any function p: V → \\[0,+∞) with the following properties: For all a ∈ F and all u, v ∈ V, p(u + v) ≤ p(u) + p(v)..._ Then I stopped reading. \"Ugh, I'll never understand math!\" I thought, for the thousandth time. Since then I've learned that every time these complex mathy bits of jargon come up in practice, it turns out I can replace them with a tiny bit of code! Like, the _L1 loss_ is just equal to `(a-b).abs().mean()`, where `a` and `b` are tensors. I guess mathy folks just think differently than me... I'll make sure in this book that every time some mathy jargon comes up, I'll give you the little bit of code it's equal to as well, and explain in common-sense terms what's going on." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "In the above code we completed various mathematical operations on *PyTorch tensors*. 
If you've done some numeric programming in PyTorch before, you may recognize these as being similar to *Numpy arrays*. Let's have a look at those two very important classes." + "We just completed various mathematical operations on PyTorch tensors. If you've done some numeric programming in PyTorch before, you may recognize these as being similar to NumPy arrays. Let's have a look at those two very important data structures." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### NumPy arrays and PyTorch tensors" + "### NumPy Arrays and PyTorch Tensors" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "[Numpy](https://numpy.org/) is the most widely used library for scientific and numeric programming in Python, and provides very similar functionality and a very similar API to that provided by PyTorch; however, it does not support using the GPU, or calculating gradients, which are both critical for deep learning. Therefore, in this book we will generally use PyTorch tensors instead of NumPy arrays, where possible.\n", + "[NumPy](https://numpy.org/) is the most widely used library for scientific and numeric programming in Python. It provides very similar functionality and a very similar API to that provided by PyTorch; however, it does not support using the GPU or calculating gradients, which are both critical for deep learning. Therefore, in this book we will generally use PyTorch tensors instead of NumPy arrays, where possible.\n", "\n", - "(Note that fastai adds some features to NumPy and PyTorch to make them a bit more similar to each other. If any code in this book doesn't work on your computer, it's possible that you forgot to include a line at the start of your notebook such as: `from fastai.vision.all import *`.)\n", + "(Note that fastai adds some features to NumPy and PyTorch to make them a bit more similar to each other. 
If any code in this book doesn't work on your computer, it's possible that you forgot to include a line like this at the start of your notebook: `from fastai.vision.all import *`.)\n", "\n", "But what are arrays and tensors, and why should you care?" ] @@ -1792,15 +1797,15 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Python is slow compared to many languages. Anything fast in Python, NumPy or PyTorch is likely to be a wrapper to a compiled object written (and optimised) in another language - specifically C. In fact, **NumPy arrays and PyTorch tensors can finish computations many thousands of times faster than using pure Python.**\n", + "Python is slow compared to many languages. Anything fast in Python, NumPy, or PyTorch is likely to be a wrapper for a compiled object written (and optimized) in another language--specifically C. In fact, **NumPy arrays and PyTorch tensors can finish computations many thousands of times faster than using pure Python.**\n", "\n", - "A NumPy array is a multidimensional table of data, with all items of the same type. Since that can be any type at all, they could even be arrays of arrays, with the innermost arrays potentially being different sizes — this is called a \"jagged array\". By \"multidimensional table\" we mean, for instance, a list (dimension of one), a table or matrix (dimension of two), a \"table of tables\" or a \"cube\" (dimension of three), and so forth. If the items are all of some simple type such as an integer or a float then NumPy will store them as a compact C data structure in memory. This is where NumPy shines. Numpy has a wide variety of operators and methods which can run computations on these compact structures at the same speed as optimized C, because they are written in optimized C.\n", + "A NumPy array is a multidimensional table of data, with all items of the same type. 
Since that can be any type at all, they can even be arrays of arrays, with the innermost arrays potentially being different sizes—this is called a \"jagged array.\" By \"multidimensional table\" we mean, for instance, a list (dimension of one), a table or matrix (dimension of two), a \"table of tables\" or \"cube\" (dimension of three), and so forth. If the items are all of some simple type such as integer or float, then NumPy will store them as a compact C data structure in memory. This is where NumPy shines. NumPy has a wide variety of operators and methods that can run computations on these compact structures at the same speed as optimized C, because they are written in optimized C.\n", "\n", - "A PyTorch tensor is nearly the same thing as a numpy array, but with an additional restriction which unlocks some additional capabilities. It's the same in that it, too, is a multidimensional table of data, with all items of the same type. However, the restriction is that a tensor cannot use just any old type — it has to use a single basic numeric type for all components. As a result, a tensor is not as flexible as a genuine array of arrays, which allows jagged arrays, where the inner arrays could have different sizes. So a PyTorch tensor cannot be jagged. It is always a regularly shaped multidimensional rectangular structure.\n", + "A PyTorch tensor is nearly the same thing as a NumPy array, but with an additional restriction that unlocks some additional capabilities. It's the same in that it, too, is a multidimensional table of data, with all items of the same type. However, the restriction is that a tensor cannot use just any old type—it has to use a single basic numeric type for all components. For example, a PyTorch tensor cannot be jagged. It is always a regularly shaped multidimensional rectangular structure.\n", "\n", - "The vast majority of methods and operators supported by NumPy on these structures are also supported by PyTorch. 
But PyTorch tensors have additional capabilities. One major capability is that these structures can live on the GPU, in which case their computation will be optimised for the GPU, and can run much faster (given lots of values to work on). In addition, PyTorch can automatically calculate derivatives of these operations, including combinations of operations. As you'll see, it would be impossible to do deep learning in practice without this capability.\n", + "The vast majority of methods and operators supported by NumPy on these structures are also supported by PyTorch, but PyTorch tensors have additional capabilities. One major capability is that these structures can live on the GPU, in which case their computation will be optimized for the GPU and can run much faster (given lots of values to work on). In addition, PyTorch can automatically calculate derivatives of these operations, including combinations of operations. As you'll see, it would be impossible to do deep learning in practice without this capability.\n", "\n", - "> S: If you don't know what C is, do not worry as you won't need it at all. In a nutshell, it's a low-level (low-level means more similar to the language that computers use internally) language that is very fast compared to Python. To take advantage of its speed while programming in Python, try to avoid as much as possible writing loops and replace them by commands that work directly on arrays or tensors.\n", + "> S: If you don't know what C is, don't worry as you won't need it at all. In a nutshell, it's a low-level (low-level means more similar to the language that computers use internally) language that is very fast compared to Python. 
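The jagged-versus-rectangular distinction drawn above can be sketched directly in NumPy (toy values; note that recent NumPy versions require `dtype=object` to build a jagged array, and only the rectangular kind could become a PyTorch tensor):

```python
import numpy as np

# A jagged array: inner rows of different lengths. NumPy stores this as
# a slow array of Python objects, not a compact C structure.
jagged = np.array([[1, 2, 3], [4, 5]], dtype=object)

# A rectangular array of one simple numeric type is stored compactly in
# memory -- this regular shape is the only kind a PyTorch tensor allows.
rect = np.array([[1, 2, 3], [4, 5, 6]])

print(jagged.shape)        # only the outer dimension is known
print(rect.shape, rect.dtype)
```

The jagged array reports only its outer length, while the rectangular one has a full multidimensional shape and a single numeric dtype.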
To take advantage of its speed while programming in Python, try to avoid as much as possible writing loops, and replace them by commands that work directly on arrays or tensors.\n", "\n", "Perhaps the most important new coding skill for a Python programmer to learn is how to effectively use the array/tensor APIs. We will be showing lots more tricks later in this book, but here's a summary of the key things you need to know for now." ] @@ -1809,7 +1814,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "To create an array or tensor, pass a list (or list of lists, or list of lists of lists, etc), to `array()` or `tensor()`:" + "To create an array or tensor, pass a list (or list of lists, or list of lists of lists, etc.) to `array()` or `tensor()`:" ] }, { @@ -1869,9 +1874,9 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "All the operations below are shown on tensors - the syntax and results for NumPy arrays is identical.\n", + "All the operations that follow are shown on tensors, but the syntax and results for NumPy arrays is identical.\n", "\n", - "You can select a row:" + "You can select a row (note that, like lists in Python, tensors are 0-indexed so 1 refers to the second row/column):" ] }, { @@ -1898,7 +1903,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "...or a column, using `:` to indicate *all of the first axis* (we sometimes refer to the dimensions of tensors/arrays as *axes*):" + "or a column, by using `:` to indicate *all of the first axis* (we sometimes refer to the dimensions of tensors/arrays as *axes*):" ] }, { @@ -1925,7 +1930,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We can combine these, along with Python slice syntax (`[start:end]`, `end` being excluded)" + "You can combine these with Python slice syntax (`[start:end]` with `end` being excluded) to select part of a row or column:" ] }, { @@ -1952,7 +1957,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We can use the standard 
operators:" + "And you can use the standard operators such as `+`, `-`, `*`, `/`:" ] }, { @@ -2007,7 +2012,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Tensors will automatically change from `int` to `float` if needed" + "And will automatically change type as needed, for example from `int` to `float`:" ] }, { @@ -2042,20 +2047,20 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Computing metrics using broadcasting" + "## Computing Metrics Using Broadcasting" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Recall that a metric is a number which is calculated from the predictions of our model, and the correct labels in our dataset, in order to tell us how good our model is. For instance, we could use either of the functions we saw in the previous section, mean squared error, or mean absolute error, and take the average of them over the whole dataset. However, neither of these are numbers that are very understandable to most people; in practice, we normally use *accuracy* as the metric for classification models.\n", + "Recall that a metric is a number that is calculated based on the predictions of our model, and the correct labels in our dataset, in order to tell us how good our model is. For instance, we could use either of the functions we saw in the previous section, mean squared error, or mean absolute error, and take the average of them over the whole dataset. However, neither of these are numbers that are very understandable to most people; in practice, we normally use *accuracy* as the metric for classification models.\n", "\n", - "As we've discussed, we will want to calculate our metric over a *validation set*. This is so that we don't inadvertently overfit -- that is, train a model to work well only on our training data. To be very precise, this is not really a risk on the pixel similarity model we're using here as a first try, since it has no trained components. 
But we'll use a validation set anyway to follow normal practices and to be ready for our second try later.\n", + "As we've discussed, we want to calculate our metric over a *validation set*. This is so that we don't inadvertently overfit--that is, train a model to work well only on our training data. This is not really a risk with the pixel similarity model we're using here as a first try, since it has no trained components, but we'll use a validation set anyway to follow normal practices and to be ready for our second try later.\n", "\n", - "To get a validation set we need to remove some of the data from training entirely, so it is not seen by the model at all. As it turns out, the creators of the MNIST dataset have already done this for us. Do you remember how there was a whole separate directory called \"valid\"? That's what this directory is for!\n", + "To get a validation set we need to remove some of the data from training entirely, so it is not seen by the model at all. As it turns out, the creators of the MNIST dataset have already done this for us. Do you remember how there was a whole separate directory called *valid*? That's what this directory is for!\n", "\n", - "So to start with, let's create tensors for our threes and sevens from that directory. These are the tensors we will use to calculate a metric measuring the quality of our first try model, which measures distance from an ideal image." + "So to start with, let's create tensors for our 3s and 7s from that directory. These are the tensors we will use to calculate a metric measuring the quality of our first-try model, which measures distance from an ideal image:" ] }, { @@ -2088,11 +2093,11 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "It's good to get in the habit of checking shapes as you go. 
Here we see two tensors, one representing the 3s validation set of 1,010 images of size 28\*28, and one representing the 7s validation set of 1,028 images of size 28\*28.\n", "\n", "We ultimately want to write a function, `is_3`, that will decide if an arbitrary image is a 3 or a 7. It will do this by deciding which of our two \"ideal digits\" this arbitrary image is closer to. For that we need to define a notion of distance--that is, a function that calculates the distance between two images.\n", "\n", "We can write a simple function that calculates the mean absolute error using an expression very similar to the one we wrote in the last section:" ] }, { @@ -2120,11 +2125,11 @@ "cell_type": "markdown", "metadata": {}, "source": [ "This is the same value we previously calculated for the distance between these two images, the ideal 3 `mean_3` and the arbitrary sample 3 `a_3`, which are both single-image tensors with a shape of `[28,28]`.\n", "\n", "But in order to calculate a metric for overall accuracy, we will 
need to calculate the distance to the ideal three for _every_ image in the validation set. So how do we do that calculation? One could write a loop over all of the single-image tensors that are stacked within our validation set tensor, `valid_3_tens`, which has a shape `[1010,28,28]` representing 1,010 images. But there is a better way.\n", + "But in order to calculate a metric for overall accuracy, we will need to calculate the distance to the ideal 3 for _every_ image in the validation set. How do we do that calculation? We could write a loop over all of the single-image tensors that are stacked within our validation set tensor, `valid_3_tens`, which has a shape of `[1010,28,28]` representing 1,010 images. But there is a better way.\n", "\n", - "Something very interesting happens when we take this exact same distance function, designed for comparing two single images, but pass in as an argument `valid_3_tens`, the tensor which represents the threes validation set:" + "Something very interesting happens when we take this exact same distance function, designed for comparing two single images, but pass in as an argument `valid_3_tens`, the tensor that represents the 3s validation set:" ] }, { @@ -2153,13 +2158,13 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Instead of complaining about shapes not matching, it returned the distance for every single image, as a vector (i.e. a rank 1 tensor) of length 1,010 (the number of threes in our validation set). How did that happen?\n", + "Instead of complaining about shapes not matching, it returned the distance for every single image as a vector (i.e., a rank-1 tensor) of length 1,010 (the number of 3s in our validation set). 
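The shape behavior being described can be sketched with NumPy, whose broadcasting rules match PyTorch's; the image data below is random, chosen only to show the shapes involved:

```python
import numpy as np

# mnist_distance as in the text, with NumPy arrays standing in for
# PyTorch tensors (the broadcasting behavior is the same).
def mnist_distance(a, b):
    return np.abs(a - b).mean(axis=(-1, -2))

rng = np.random.default_rng(0)
valid_stack = rng.random((1010, 28, 28))  # stand-in for valid_3_tens
ideal = rng.random((28, 28))              # stand-in for mean3

# The (28,28) array is broadcast against the (1010,28,28) stack,
# yielding one distance per image.
dists = mnist_distance(valid_stack, ideal)
print(dists.shape)
```

Passing a single image for both arguments still works and returns a single scalar, which is exactly the dual behavior the text relies on.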
How did that happen?\n", "\n", "Take another look at our function `mnist_distance`, and you'll see that it contains the subtraction `(a-b)`.\n", "\n", "The magic trick is that PyTorch, when it tries to perform a simple subtraction operation between two tensors of different ranks, will use *broadcasting*. That is, it will automatically expand the tensor with the smaller rank to have the same size as the one with the larger rank. Broadcasting is an important capability that makes tensor code much easier to write.\n", "\n", "After broadcasting so the two argument tensors have the same rank, PyTorch applies its usual logic for two tensors of the same rank: it performs the operation on each corresponding element of the two tensors, and returns the tensor result. For instance:" ] }, { @@ -2186,7 +2191,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "So in this case, PyTorch treats `mean3`, a rank 2 tensor representing a single image, as if it was 1010 copies of the same image, and then subtracts each of those copies from each \"three\" in our validation set. What shape would you expect this tensor to have? 
Try to figure it out yourself before you look at the answer below:" + "So in this case, PyTorch treats `mean3`, a rank-2 tensor representing a single image, as if it were 1,010 copies of the same image, and then subtracts each of those copies from each 3 in our validation set. What shape would you expect this tensor to have? Try to figure it out yourself before you look at the answer below:" ] }, { @@ -2213,22 +2218,22 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We are calculating the difference between the \"ideal 3\" and each of 1,010 threes in the validation set, for each of `28x28` images, resulting in the shape `1010,28,28`.\n", + "We are calculating the difference between our \"ideal 3\" and each of the 1,010 3s in the validation set, for each of 28\\*28 images, resulting in the shape `[1010,28,28]`.\n", "\n", "There are a couple of important points about how broadcasting is implemented, which make it valuable not just for expressivity but also for performance:\n", "\n", - "- PyTorch doesn't *actually* copy `mean3` 1010 times. Instead, it just *pretends* as if it was a tensor of that shape, but doesn't actually allocate any additional memory\n", - "- It does the whole calculation in C (or, if you're using a GPU, in CUDA, the equivalent of C on the GPU), tens of thousands of times faster than pure Python (up to millions of times faster on a GPU!)\n", + "- PyTorch doesn't *actually* copy `mean3` 1,010 times. It *pretends* it were a tensor of that shape, but doesn't actually allocate any additional memory\n", + "- It does the whole calculation in C (or, if you're using a GPU, in CUDA, the equivalent of C on the GPU), tens of thousands of times faster than pure Python (up to millions of times faster on a GPU!).\n", "\n", - "This is true of all broadcasting and elementwise operations and functions done in PyTorch. 
**It's the most important technique for you to know to create efficient PyTorch code.** \n", + "This is true of all broadcasting and elementwise operations and functions done in PyTorch. *It's the most important technique for you to know to create efficient PyTorch code.*\n", "\n", - "Next in `mnist_distance` we see `abs()`. You might be able to guess now what this does when applied to a tensor. It applies the method to each individual element in the tensor, and returns a tensor of the results (that is, it applies the method \"elementwise\"). So in this case, we'll get back 1,010 absolute values.\n", + "Next in `mnist_distance` we see `abs`. You might be able to guess now what this does when applied to a tensor. It applies the method to each individual element in the tensor, and returns a tensor of the results (that is, it applies the method \"elementwise\"). So in this case, we'll get back 1,010 absolute values.\n", "\n", - "Finally, our function calls `mean((-1,-2))`. The tuple `(-1,-2)` represents a range of axes. In Python, `-1` refers to the last element, and `-2` refers to the second last. So in this case, this tells PyTorch that we want to take the mean ranging over the values indexed by the last two axes of the tensor. The last two axes are the horizontal and vertical dimensions of an image. So after taking the mean over the last two axes, we are left with just the first tensor axis, which indexes over our images, which is why our final size was `(1010)`. In other words, for every image, we averaged the intensity of all the pixels in that image.\n", + "Finally, our function calls `mean((-1,-2))`. The tuple `(-1,-2)` represents a range of axes. In Python, `-1` refers to the last element, and `-2` refers to the second-to-last. So in this case, this tells PyTorch that we want to take the mean ranging over the values indexed by the last two axes of the tensor. The last two axes are the horizontal and vertical dimensions of an image. 
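The negative-axis averaging just described can be sketched on a tiny rank-3 array (NumPy again standing in for tensors; the values are arbitrary):

```python
import numpy as np

# A small rank-3 "stack of images": 2 images of 3x4 "pixels" each.
t = np.arange(24).reshape(2, 3, 4)

# Averaging over the last two axes (-1 and -2, the "pixel" axes)
# leaves only the first axis: one mean value per image.
per_image = t.mean(axis=(-1, -2))
print(per_image.shape)
```

The result has one entry per image, mirroring how `mean((-1,-2))` collapses each 28\*28 image to a single average intensity.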
After taking the mean over the last two axes, we are left with just the first tensor axis, which indexes over our images, which is why our final size was `(1010)`. In other words, for every image, we averaged the intensity of all the pixels in that image.\n", "\n", - "We'll be learning lots more about broadcasting throughout this book, especially in <>, and will be practising it regularly too.\n", + "We'll be learning lots more about broadcasting throughout this book, especially in <>, and will be practicing it regularly too.\n", "\n", - "We can use this `mnist_distance` to figure out whether an image is a three or not by using the following logic: if the distance between the digit in question and the ideal 3 is less than the distance to the ideal 7, then it's a 3. This function will automatically do broadcasting and be applied elementwise, just like all PyTorch functions and operators." + "We can use `mnist_distance` to figure out whether an image is a 3 or not by using the following logic: if the distance between the digit in question and the ideal 3 is less than the distance to the ideal 7, then it's a 3. This function will automatically do broadcasting and be applied elementwise, just like all PyTorch functions and operators:" ] }, { @@ -2244,9 +2249,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Let's test it on our example case.\n", - "\n", - "Note also that when we convert the boolean response to a float, we get a `1.0` for true and `0.0` for false:" + "Let's test it on our example case:" ] }, { @@ -2273,7 +2276,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Thanks to broadcasting, we can also test it on the full validation set of threes:" + "Note that when we convert the Boolean response to a float, we get `1.0` for `True` and `0.0` for `False`. 
Thanks to broadcasting, we can also test it on the full validation set of 3s:" ] }, { @@ -2300,7 +2303,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Now we can calculate the accuracy for each of threes and sevens, by taking the average of that function for all threes, and it's inverse for all sevens:" + "Now we can calculate the accuracy for each of the 3s and 7s by taking the average of that function for all 3s and its inverse for all 7s:" ] }, { @@ -2330,11 +2333,11 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "This looks like a pretty good start! We're getting over 90% accuracy on both threes and sevens. And we've seen how to define a metric conveniently using broadcasting.\n", + "This looks like a pretty good start! We're getting over 90% accuracy on both 3s and 7s, and we've seen how to define a metric conveniently using broadcasting.\n", "\n", - "But let's be honest: threes and sevens are very different looking digits. And we're only classifying two out of the ten possible digits so far. So we're going to need to do better!\n", + "But let's be honest: 3s and 7s are very different-looking digits. And we're only classifying 2 out of the 10 possible digits so far. So we're going to need to do better!\n", "\n", - "To do better, perhaps it is time to try a system that does some real learning -- that is, that can automatically modify itself to improve its performance. In other words, it's time to talk about the training process, and SGD." + "To do better, perhaps it is time to try a system that does some real learning--that is, that can automatically modify itself to improve its performance. In other words, it's time to talk about the training process, and SGD." 
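The accuracy-from-Booleans pattern used just above can be sketched with made-up numbers; the per-image distances below are invented stand-ins for `mnist_distance` outputs on a tiny four-image validation set:

```python
import numpy as np

# Made-up distances from each of four images to the two "ideal digits".
dist_to_3 = np.array([0.10, 0.20, 0.40, 0.15])
dist_to_7 = np.array([0.30, 0.10, 0.50, 0.40])

# Elementwise comparison gives a Boolean vector: "is this image a 3?"
is_3 = dist_to_3 < dist_to_7

# Converting to float maps True -> 1.0 and False -> 0.0, so the mean
# of the vector is the accuracy.
accuracy_3s = is_3.astype(float).mean()
print(is_3, accuracy_3s)
```

Three of the four toy images are closer to the ideal 3, so the mean of the Boolean vector comes out to 0.75.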
] }, { @@ -2348,13 +2351,13 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Do you remember the way that Arthur Samuel described machine learning, which we quoted in <>:\n", + "Do you remember the way that Arthur Samuel described machine learning, which we quoted in <>?\n", "\n", - "> : _Suppose we arrange for some automatic means of testing the effectiveness of any current weight assignment in terms of actual performance and provide a mechanism for altering the weight assignment so as to maximize the performance. We need not go into the details of such a procedure to see that it could be made entirely automatic and to see that a machine so programmed would \"learn\" from its experience._\n", + "> : Suppose we arrange for some automatic means of testing the effectiveness of any current weight assignment in terms of actual performance and provide a mechanism for altering the weight assignment so as to maximize the performance. We need not go into the details of such a procedure to see that it could be made entirely automatic and to see that a machine so programmed would \"learn\" from its experience.\n", "\n", - "As we discussed, this is the key to allowing us to have something which can get better and better — to learn. But our pixel similarity approach does not really do this. We do not have any kind of weight assignment, or any way of improving based on testing the effectiveness of a weight assignment. In other words, we can't really improve our pixel similarity approach by modifying a set of parameters. In order to take advantage of the power of deep learning, we will first have to represent our task in the way that Arthur Samuel described it.\n", + "As we discussed, this is the key to allowing us to have a model that can get better and better—that can learn. But our pixel similarity approach does not really do this. We do not have any kind of weight assignment, or any way of improving based on testing the effectiveness of a weight assignment. 
In other words, we can't really improve our pixel similarity approach by modifying a set of parameters. In order to take advantage of the power of deep learning, we will first have to represent our task in the way that Arthur Samuel described it.\n", "\n", "Instead of trying to find the similarity between an image and an \"ideal image,\" we could instead look at each individual pixel and come up with a set of weights for each one, such that the highest weights are associated with those pixels most likely to be black for a particular category. For instance, pixels toward the bottom right are not very likely to be activated for a 7, so they should have a low weight for a 7, but they are likely to be activated for an 8, so they should have a high weight for an 8. This can be represented as a function and set of weight values for each possible category--for instance the probability of being the number 8:\n", "\n", "```\n", "def pr_eight(x,w): return (x*w).sum()\n", "```" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Here we are assuming that X is the image, represented as a vector -- in other words, with all of the rows stacked up end to end into a single long line. And we are assuming that the weights are a vector W. 
If we have this function, then we just need some way to update the weights to make them a little bit better. With such an approach, we can repeat that step a number of times, making the weights better and better, until they are as good as we can make them.\n", + "Here we are assuming that `x` is the image, represented as a vector--in other words, with all of the rows stacked up end to end into a single long line. And we are assuming that the weights are a vector `w`. If we have this function, then we just need some way to update the weights to make them a little bit better. With such an approach, we can repeat that step a number of times, making the weights better and better, until they are as good as we can make them.\n", "\n", - "We want to find the specific values for the vector W which causes our function to be high for those images that are actually an eight, and low for those images which are not. Searching for the best vector W is a way to search for the best function for recognising eights. (Because we are not yet using a deep neural network, we are limited by what our function can actually do — we are going to fix that constraint later in this chapter.) \n", + "We want to find the specific values for the vector `w` that cause the result of our function to be high for those images that are actually 8s, and low for those images that are not. Searching for the best vector `w` is a way to search for the best function for recognizing 8s. (Because we are not yet using a deep neural network, we are limited by what our function can actually do—we are going to fix that constraint later in this chapter.) \n", "\n", "To be more specific, here are the steps that we are going to require, to turn this function into a machine learning classifier:\n", "\n", - "1. *Initialize* the weights\n", - "1. For each image, use these weights to *predict* whether it appears to be a three or a seven\n", - "1. 
Based on these predictions, calculate how good the model is (its *loss*)\n", + "1. *Initialize* the weights.\n", + "1. For each image, use these weights to *predict* whether it appears to be a 3 or a 7.\n", + "1. Based on these predictions, calculate how good the model is (its *loss*).\n", "1. Calculate the *gradient*, which measures for each weight, how changing that weight would change the loss\n", - "1. *Step* (that is, change) all weights based on that calculation\n", - "1. Go back to the second step, and *repeat* the process\n", - "1. ...until you decide to *stop* the training process (for instance because the model is good enough, or you don't want to wait any longer)" + "1. *Step* (that is, change) all the weights based on that calculation.\n", + "1. Go back to step 2, and *repeat* the process.\n", + "1. Iterate until you decide to *stop* the training process (for instance, because the model is good enough or you don't want to wait any longer)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "These seven steps, illustrated in <>, are the key to the training of all deep learning models. That deep learning turns out to rely entirely on these steps is extremely surprising and counterintuitive. It's amazing that this process can solve such complex problems. But, as you'll see, it really does!" ] }, { @@ -2500,21 +2510,19 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "These seven steps, illustrated in <> are the key to the training of all deep learning models. That deep learning turns out to rely entirely on these steps is extremely surprising and counter-intuitive. It's amazing that this process can solve such complex problems. But, as you'll see, it really does!\n", + "There are many different ways to do each of these seven steps, and we will be learning about them throughout the rest of this book. 
These are the details that make a big difference for deep learning practitioners, but it turns out that the general approach to each one generally follows some basic principles. Here are a few guidelines:\n", "\n", - "There are many different ways to do each of these seven steps, and we will be learning about them throughout the rest of this book. These are the details which make a big difference for deep learning practitioners. But it turns out that the general approach to each one generally follows some basic principles:\n", - "\n", - "- **Initialize**:: we initialize the parameters to random values. This may sound surprising. There are certainly other choices we could make, such as initialising them to the percentage of times that that pixel is activated for that category. But since we already know that we have a routine to improve these weights, it turns out that just starting with random weights works perfectly well\n", - "- **Loss**:: This is the thing Arthur Samuel referred to: \"*testing the effectiveness of any current weight assignment in terms of actual performance*\". We need some function that will return a number that is small if the performance of the model is good (the standard approach is to treat a small loss as good, and a large loss as bad, although this is just a convention)\n", - "- **Step**:: A simple way to figure out whether a weight should be increased a bit, or decreased a bit, would be just to try it. Increase the weight by a small amount, and see if the loss goes up or down. Once you find the correct direction, you could then change that amount by a bit more, and a bit less, until you find an amount which works well. However, this is slow! As we will see, the magic of calculus allows us to directly figure out which direction, and roughly how much, to change each weight, without having to try all these small changes. The way to do this is by calculating *gradients*. 
This is just a performance optimisation, we would get exactly the same results by using the slower manual process as well\n", - "- **Stop**:: We have already discussed how to choose how many epochs to train a model for. This is where that decision is applied. For our digit classifier, we would keep training until the accuracy of the model started getting worse, or we ran out of time." + "- Initialize:: We initialize the parameters to random values. This may sound surprising. There are certainly other choices we could make, such as initializing them to the percentage of times that pixel is activated for that category--but since we already know that we have a routine to improve these weights, it turns out that just starting with random weights works perfectly well.\n", + "- Loss:: This is what Samuel referred to when he spoke of *testing the effectiveness of any current weight assignment in terms of actual performance*. We need some function that will return a number that is small if the performance of the model is good (the standard approach is to treat a small loss as good, and a large loss as bad, although this is just a convention).\n", + "- Step:: A simple way to figure out whether a weight should be increased a bit, or decreased a bit, would be just to try it: increase the weight by a small amount, and see if the loss goes up or down. Once you find the correct direction, you could then change that amount by a bit more, and a bit less, until you find an amount that works well. However, this is slow! As we will see, the magic of calculus allows us to directly figure out in which direction, and by roughly how much, to change each weight, without having to try all these small changes. The way to do this is by calculating *gradients*. 
This is just a performance optimization; we would get exactly the same results by using the slower manual process as well.\n", + "- Stop:: Once we've decided how many epochs to train the model for (a few suggestions for this were given in the earlier list), we apply that decision. For our digit classifier, we would keep training until the accuracy of the model started getting worse, or we ran out of time." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Before applying these steps to our image classification problem, let's illustrate what they look like in a simpler case. First we will define a very simple function, the quadratic — let's pretend that this is our loss function, and `x` is a weight parameter of the function:" + "Before applying these steps to our image classification problem, let's illustrate what they look like in a simpler case. First we will define a very simple function, the quadratic—let's pretend that this is our loss function, and `x` is a weight parameter of the function:" ] }, { @@ -2559,7 +2567,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The sequence of steps we described above starts by picking some random value for a parameter, and calculating the value of the loss:" + "The sequence of steps we described earlier starts by picking some random value for a parameter, and calculating the value of the loss:" ] }, { @@ -2589,7 +2597,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Now we look to see what would happen if we increased or decreased our parameter by a little bit — the *adjustment*. This is simply the slope at a particular point:" + "Now we look to see what would happen if we increased or decreased our parameter by a little bit—the *adjustment*. 
This is simply the slope at a particular point:" ] }, { @@ -2617,36 +2625,36 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "This basic idea goes all the way back to Isaac Newton, who pointed out that we can optimise arbitrary functions in this way. Regardless of how complicated our functions become, this basic approach of gradient descent will not significantly change. The only minor changes we will see later in this book are some handy ways we can make it faster, by finding better steps." + "This basic idea goes all the way back to Isaac Newton, who pointed out that we can optimize arbitrary functions in this way. Regardless of how complicated our functions become, this basic approach of gradient descent will not significantly change. The only minor changes we will see later in this book are some handy ways we can make it faster, by finding better steps." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### The gradient" + "### Calculating Gradients" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "The one magic step is the bit where we calculate the *gradients*. As we mentioned, we use calculus as a performance optimization; it allows us to more quickly calculate whether our loss will go up or down when we adjust our parameters up or down. In other words, the gradients will tell us how much we have to change each weight to make our model better.\n", + "The one magic step is the bit where we calculate the gradients. As we mentioned, we use calculus as a performance optimization; it allows us to more quickly calculate whether our loss will go up or down when we adjust our parameters up or down. In other words, the gradients will tell us how much we have to change each weight to make our model better.\n", "\n", - "Perhaps you remember back to your high school calculus class: the *derivative* of a function tells you how much a change in the parameters of a function will change its result. 
Don't worry, lots of us forget our calculus once high school is behind us! But you will have to have some intuitive understanding of what a derivative is before you continue, so if this is all very fuzzy in your head, head over to Khan Academy and complete the lessons on basic derivatives. You won't have to know how to calculate them yourselves, you just have to know what a derivative is.\n", + "You may remember from your high school calculus class that the *derivative* of a function tells you how much a change in its parameters will change its result. If not, don't worry, lots of us forget calculus once high school is behind us! But you will have to have some intuitive understanding of what a derivative is before you continue, so if this is all very fuzzy in your head, head over to Khan Academy and complete the [lessons on basic derivatives](https://www.khanacademy.org/math/differential-calculus/dc-diff-intro). You won't have to know how to calculate them yourself; you just have to know what a derivative is.\n", "\n", - "The key point about a derivative is this: for any function, such as the quadratic function we saw before, we can calculate its derivative. The derivative is another function. It calculates the change, rather than the value. For instance, the derivative of the quadratic function at the value three tells us how rapidly the function changes at the value three. More specifically, you may remember from high school that gradient is defined as \"rise/run\", that is, the change in the value of the function, divided by the change in the value of the parameter. When we know how our function will change, then we know what we need to do to make it smaller. This is the key to machine learning: having a way to change the parameters of a function to make it smaller. Calculus provides us with a computational shortcut, the derivative, which lets us directly calculate the gradient of our functions."
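The *rise/run* definition can be checked numerically, without any calculus, for the quadratic we have been using. The `slope` helper below is our own illustration, not part of PyTorch: for f(x) = x\*\*2, the slope at x=3 should be 2\*3 = 6, which is exactly what PyTorch will report for us shortly.

```python
# A numerical check of the "rise/run" definition of the gradient.
def f(x): return x ** 2

def slope(f, x, run=1e-6):
    # rise/run: the change in the function's value, divided by the
    # (small) change in the parameter around the point x
    return (f(x + run) - f(x - run)) / (2 * run)

print(round(slope(f, 3.0), 4))  # prints 6.0
```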
+ "The key point about a derivative is this: for any function, such as the quadratic function we saw in the previous section, we can calculate its derivative. The derivative is another function. It calculates the change, rather than the value. For instance, the derivative of the quadratic function at the value 3 tells us how rapidly the function changes at the value 3. More specifically, you may recall that gradient is defined as *rise/run*, that is, the change in the value of the function, divided by the change in the value of the parameter. When we know how our function will change, then we know what we need to do to make it smaller. This is the key to machine learning: having a way to change the parameters of a function to make it smaller. Calculus provides us with a computational shortcut, the derivative, which lets us directly calculate the gradients of our functions." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "One important thing to be aware of: our function has lots of weights that we need to adjust, so when we calculate the derivative we won't get back one number, but lots of them — a gradient for every weight. But there is nothing mathematically tricky here; you can calculate the derivative with respect to one weight, and treat all the other ones as constant. Then repeat that for each weight. This is how all of the gradients are calculated, for every weight.\n", + "One important thing to be aware of is that our function has lots of weights that we need to adjust, so when we calculate the derivative we won't get back one number, but lots of them—a gradient for every weight. But there is nothing mathematically tricky here; you can calculate the derivative with respect to one weight, and treat all the other ones as constant, then repeat that for each other weight. This is how all of the gradients are calculated, for every weight.\n", "\n", - "We mentioned just now that you won't have to calculate any gradients yourselves. How can that be? 
Amazingly enough, PyTorch is able to automatically compute the derivative of nearly any function! What's more, it does it very fast. Most of the time, it will be at least as fast as any derivative function that you can create by hand. Let's see an example.\n", + "We mentioned just now that you won't have to calculate any gradients yourself. How can that be? Amazingly enough, PyTorch is able to automatically compute the derivative of nearly any function! What's more, it does it very fast. Most of the time, it will be at least as fast as any derivative function that you can create by hand. Let's see an example.\n", "\n", - "First, pick a tensor value which we want gradients at:" + "First, let's pick a tensor value which we want gradients at:" ] }, { @@ -2662,11 +2670,11 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Notice the special method `requires_grad_`? That's the magical incantation we use to tell PyTorch that we want to calculate gradients with respect to that variable at that value. It is essentially tagging the variable, so PyTorch will remember to keep track of how to compute gradients of the other, direct calculations on it which you will ask for.\n", + "Notice the special method `requires_grad_`? That's the magical incantation we use to tell PyTorch that we want to calculate gradients with respect to that variable at that value. It is essentially tagging the variable, so PyTorch will remember to keep track of how to compute gradients of the other, direct calculations on it that you will ask for.\n", "\n", - "> a: This API might throw you off if you're coming from math or physics. In those contexts the \"gradient\" of a function is just another function (i.e., its derivative), so you might expect gradient-related APIs to give you a new function. But in deep learning, \"gradients\" usually means the _value_ of a function's derivative at a particular argument value. 
PyTorch API also puts the focus on that argument, not the function you're actually computing the gradients of. It may feel backwards at first but it's just a different perspective.\n", + "> a: This API might throw you off if you're coming from math or physics. In those contexts the \"gradient\" of a function is just another function (i.e., its derivative), so you might expect gradient-related APIs to give you a new function. But in deep learning, \"gradients\" usually means the _value_ of a function's derivative at a particular argument value. The PyTorch API also puts the focus on the argument, not the function you're actually computing the gradients of. It may feel backwards at first, but it's just a different perspective.\n", "\n", - "Now we calculate our function with that value. Notice how PyTorch prints not just the value calculated, but also a note that it has a gradient function it'll be using to calculate our gradient when needed:" + "Now we calculate our function with that value. Notice how PyTorch prints not just the value calculated, but also a note that it has a gradient function it'll be using to calculate our gradients when needed:" ] }, { @@ -2710,7 +2718,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The \"backward\" here refers to \"back propagation\", which is the name given to the process of calculating the derivative of each layer. We'll see how this is done exactly in chapter , when we calculate the gradients of a deep neural net from scratch. This is called the \"backward pass\" of the network, as opposed to the \"forward pass\", which is where the activations are calculated. Life would probably be easier if `backward` was just called `calculate_grad`, but deep learning folks really do like to add jargon everywhere they can!" + "The \"backward\" here refers to *backpropagation*, which is the name given to the process of calculating the derivative of each layer. 
We'll see how this is done exactly in chapter <>, when we calculate the gradients of a deep neural net from scratch. This is called the \"backward pass\" of the network, as opposed to the \"forward pass,\" which is where the activations are calculated. Life would probably be easier if `backward` was just called `calculate_grad`, but deep learning folks really do like to add jargon everywhere they can!" ] }, { @@ -2744,9 +2752,9 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "If you remember your high school calculus rules, the derivative of `x**2` is `2*x`, and we have `x=3`, so the gradient should be `2*3=6`, which is what PyTorch calculated for us!\n", + "If you remember your high school calculus rules, the derivative of `x**2` is `2*x`, and we have `x=3`, so the gradients should be `2*3=6`, which is what PyTorch calculated for us!\n", "\n", - "Now we'll repeat the above steps, but with a vector argument for our function:" + "Now we'll repeat the preceding steps, but with a vector argument for our function:" ] }, { @@ -2774,7 +2782,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "...and adding `sum()` to our function so it can take a vector (i.e. a *rank-1 tensor*), and return a scalar (i.e. a *rank-0 tensor*):" + "And we'll add `sum` to our function so it can take a vector (i.e., a rank-1 tensor), and return a scalar (i.e., a rank-0 tensor):" ] }, { @@ -2832,27 +2840,27 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The gradient only tells us the slope of our function, it doesn't actually tell us exactly how far to adjust the parameters. But it gives us some idea of how far; if the slope is very large, then that may suggest that we have more adjustments to do, whereas if the slope is very small, that may suggest that we are close to the optimal value." + "The gradients only tell us the slope of our function, they don't actually tell us exactly how far to adjust the parameters. 
But they give us some idea of how far; if the slope is very large, then that may suggest that we have more adjustments to do, whereas if the slope is very small, that may suggest that we are close to the optimal value." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Stepping with a learning rate" + "### Stepping With a Learning Rate" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Deciding how to change our parameters based on the value of the gradients is an important part of the deep learning process. Nearly all approaches start with the basic idea of multiplying the gradient by some small number, called the *learning rate* (LR). The learning rate is often a number between 0.001 and 0.1, although it could be anything. Often, people select a learning rate just by trying a few, and finding which results in the best model after training (we'll show you a better approach later in this book, called the *learning rate finder*). Once you've picked a learning rate, you can adjust your parameters using this simple function:\n", + "Deciding how to change our parameters based on the values of the gradients is an important part of the deep learning process. Nearly all approaches start with the basic idea of multiplying the gradient by some small number, called the *learning rate* (LR). The learning rate is often a number between 0.001 and 0.1, although it could be anything. Often, people select a learning rate just by trying a few, and finding which results in the best model after training (we'll show you a better approach later in this book, called the *learning rate finder*). 
Once you've picked a learning rate, you can adjust your parameters using this simple function:\n", "\n", "```\n", "w -= gradient(w) * lr\n", "```\n", "\n", - "This is known as *stepping* your parameters, using an *optimiser step*.\n", + "This is known as *stepping* your parameters, using an *optimizer step*.\n", "\n", "If you pick a learning rate that's too low, it can mean having to do a lot of steps. <> illustrates that." ] @@ -2868,7 +2876,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Although picking a learning rate that's too high is even worse--it can actually result in the loss getting *worse* as we see in <>!" + "But picking a learning rate that's too high is even worse--it can actually result in the loss getting *worse*, as we see in <>!" ] }, { @@ -2896,23 +2904,23 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Now let's apply all of this on an end-to-end example." + "Now let's apply all of this in an end-to-end example." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### An end-to-end SGD example" + "### An End-to-End SGD Example" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "We've seen how to use gradients to find a minimum. Now it's time to look at an SGD example, and see how finding a minimum can be used to train a model to fit data better.\n", + "We've seen how to use gradients to find a minimum. Now it's time to look at an SGD example and see how finding a minimum can be used to train a model to fit data better.\n", "\n", - "Let us start with a simple, synthetic, example model. Imagine you were measuring the speed of a roller coaster as it went over the top of a hump. It would start fast, and then get slower as it went up the hill, and then would be slowest at the top, and it would then speed up again as it goes downhill. You want to build a model of how the speed changes over time. 
If you're measuring the speed manually every second for 20 seconds, it might look something like this:" + "Let's start with a simple, synthetic, example model. Imagine you were measuring the speed of a roller coaster as it went over the top of a hump. It would start fast, and then get slower as it went up the hill; it would be slowest at the top, and it would then speed up again as it went downhill. You want to build a model of how the speed changes over time. If you were measuring the speed manually every second for 20 seconds, it might look something like this:" ] }, { @@ -2962,9 +2970,9 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We've added a bit of random noise, since measuring things manually isn't precise. This means it's not that easy to answer the question: what was the roller coaster's speed? Using SGD we can try to find a function that matches our observations. We can't consider every possible function, so let's use a guess that it will be quadratic, i.e. a function of the form `a*(time**2)+(b*time)+c`.\n", + "We've added a bit of random noise, since measuring things manually isn't precise. This means it's not that easy to answer the question: what was the roller coaster's speed? Using SGD we can try to find a function that matches our observations. We can't consider every possible function, so let's use a guess that it will be quadratic; i.e., a function of the form `a*(time**2)+(b*time)+c`.\n", "\n", - "We want to distinguish clearly between the function's input (the time when we are measuring the coaster's speed) and its parameters (the values that define *which* quadratic we're trying). So let us collect the parameters in one argument and thus separate the input, `t`, and the parameters, `params`, in the function's signature: " + "We want to distinguish clearly between the function's input (the time when we are measuring the coaster's speed) and its parameters (the values that define *which* quadratic we're trying). 
So, let's collect the parameters in one argument and thus separate the input, `t`, and the parameters, `params`, in the function's signature: " ] }, { @@ -2982,11 +2990,11 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "In other words, we've restricted the problem of finding the best imaginable function that fits the data, to finding the best *quadratic* function. This greatly simplifies the problem, since every quadratic function is fully defined by the three parameters `a`, `b`, and `c`. So to find the best quadratic function, we only need to find the best values for `a`, `b`, and `c`.\n", + "In other words, we've restricted the problem of finding the best imaginable function that fits the data, to finding the best *quadratic* function. This greatly simplifies the problem, since every quadratic function is fully defined by the three parameters `a`, `b`, and `c`. Thus, to find the best quadratic function, we only need to find the best values for `a`, `b`, and `c`.\n", "\n", - "If we can solve this problem for the three parameters of a quadratic function, we'll be able to apply the same approach for other, more complex functions with more parameters--such as a neural net. So let's find the parameters for `f` first, and then we'll come back and do the same thing for the MNIST dataset with a neural net.\n", + "If we can solve this problem for the three parameters of a quadratic function, we'll be able to apply the same approach for other, more complex functions with more parameters--such as a neural net. Let's find the parameters for `f` first, and then we'll come back and do the same thing for the MNIST dataset with a neural net.\n", "\n", - "We need to define first what we mean by \"best\". We define this precisely by choosing a *loss function*, which will return a value based on a prediction and a target, where lower values of the function correspond to \"better\" predictions. 
For continuous data, it's common to use *mean squared error*:" + "We need to define first what we mean by \"best.\" We define this precisely by choosing a *loss function*, which will return a value based on a prediction and a target, where lower values of the function correspond to \"better\" predictions. For continuous data, it's common to use *mean squared error*:" ] }, { @@ -3002,9 +3010,21 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Now, let's work through our 7 step process.\n", - "\n", - "Step 1--*Initialize* the parameters to random values, and tell PyTorch that we want to track their gradients, using `requires_grad_`:" + "Now, let's work through our 7 step process." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Step 1: Initialize the parameters" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "First, we initialize the parameters to random values, and tell PyTorch that we want to track their gradients, using `requires_grad_`:" ] }, { @@ -3030,7 +3050,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Step 2--Calculate the *predictions*:" + "#### Step 2: Calculate the predictions" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next, we calculate the predictions:" ] }, { @@ -3088,9 +3115,21 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "This doesn't look very close--our random parameters suggest that the roller coaster will end up going backwards, since we have negative speeds!\n", - "\n", - "Step 3--Calculate the *loss*:" + "This doesn't look very close--our random parameters suggest that the roller coaster will end up going backwards, since we have negative speeds!" 
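The mean squared error just mentioned can be sketched in plain Python. The chapter's actual `mse` works on PyTorch tensors, but the formula is the same: average the squared differences between predictions and targets.

```python
# A plain-Python sketch of mean squared error.
def mse(preds, targets):
    return sum((p - t) ** 2 for p, t in zip(preds, targets)) / len(preds)

print(mse([2.0, 4.0], [1.0, 6.0]))  # (1**2 + (-2)**2) / 2 = 2.5
```

Squaring makes every error positive (so over- and under-predictions can't cancel out) and penalizes large errors more heavily than small ones.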
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Step 3: Calculate the loss" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We calculate the loss as follows:" ] }, { @@ -3118,9 +3157,21 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Our goal is now to improve this. To do that, we'll need to know the gradients.\n", - "\n", - "Step 4--Calculate the *gradients*. In other words, calculate an approximation of how the parameters need to change." + "Our goal is now to improve this. To do that, we'll need to know the gradients." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Step 4: Calculate the gradients" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The next step is to calculate the gradients. In other words, calculate an approximation of how the parameters need to change:" ] }, { @@ -3168,7 +3219,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We can use these gradients to improve our parameters. We'll need to pick a learning rate (we'll discuss how to do that in practice in the next chapter; for now we'll just pick `1e-5`(0.00001)):" + "We can use these gradients to improve our parameters. We'll need to pick a learning rate (we'll discuss how to do that in practice in the next chapter; for now we'll just use 1e-5, or 0.00001):" ] }, { @@ -3195,9 +3246,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Step 5--*Step* the weights. In other words, update the parameters based on the gradients we just calculated.\n", - "\n", - "> a: Understanding this bit depends on remembering recent history. To calculate the gradients we call `backward()` on the `loss`. 
But this `loss` was itself calculated by `mse()`, which in turn took `preds` as an input, which was calculated using `f` taking as an input `params`, which was the object on which we originally called `required_grads_()` -- which is the original call that now allows us to call `backward()` on `loss`. This chain of function calls represents the mathematical composition of functions, which enables PyTorch to use calculus's chain rule under the hood to calculate these gradients." + "#### Step 5: Step the weights" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ + "Now we need to update the parameters based on the gradients we just calculated:" ] }, { @@ -3211,6 +3267,13 @@ "params.grad = None" ] }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "> a: Understanding this bit depends on remembering recent history. To calculate the gradients we call `backward` on the `loss`. But this `loss` was itself calculated by `mse`, which in turn took `preds` as an input, which was calculated using `f` taking as an input `params`, which was the object on which we originally called `requires_grad_`--which is the original call that now allows us to call `backward` on `loss`. This chain of function calls represents the mathematical composition of functions, which enables PyTorch to use calculus's chain rule under the hood to calculate these gradients." + ] + }, { "cell_type": "markdown", "metadata": {}, "source": [ @@ -3243,7 +3306,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "...and take a look at the plot:" + "And take a look at the plot:" ] }, { @@ -3295,9 +3358,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "...now we're ready for step 6!\n", - "\n", - "Step 6--*Repeat* the process. By looping and performing many improvements, we hope to reach a good result." + "#### Step 6: Repeat the process" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now we iterate. 
By looping and performing many improvements, we hope to reach a good result:"
   ]
  },
@@ -3340,7 +3408,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-    "Loss is going down, just as we hoped! But looking only at these loss numbers disguises the fact that each iteration represents an entirely different quadratic function being tried, on the way to find the best possible quadratic function. We can see this process visually if, instead of printing out the loss function, we plot the function at every step. Then we can see how the shape is approaching the best possible quadratic function for our data:"
+    "The loss is going down, just as we hoped! But looking only at these loss numbers disguises the fact that each iteration represents an entirely different quadratic function being tried, on the way to finding the best possible quadratic function. We can see this process visually if, instead of printing out the loss function, we plot the function at every step. Then we can see how the shape is approaching the best possible quadratic function for our data:"
   ]
  },
@@ -3371,14 +3439,21 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-    "Step 7 is to *stop*. We just decided to stop after 10 epochs arbitrarily. In practice, we watch the training and validation losses and our metrics to decide when to stop, as we've discussed."
+    "#### Step 7: Stop"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "We just decided to stop after 10 epochs arbitrarily. In practice, we would watch the training and validation losses and our metrics to decide when to stop, as we've discussed."
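The seven steps can be collected into one end-to-end sketch. This is a framework-free illustration (plain Python only; finite-difference gradients stand in for `loss.backward`, and the data, learning rate, and epoch count are made up for the example):

```python
import random

def f(t, params):
    a, b, c = params
    return a*t**2 + b*t + c

def mse(preds, targets):
    return sum((p - y)**2 for p, y in zip(preds, targets)) / len(preds)

def grads(params, xs, ys, eps=1e-6):
    # Finite differences stand in for what autograd computes via the chain rule.
    base = mse([f(x, params) for x in xs], ys)
    out = []
    for i in range(len(params)):
        shifted = list(params)
        shifted[i] += eps
        out.append((mse([f(x, shifted) for x in xs], ys) - base) / eps)
    return out

random.seed(42)
xs = [i/10 for i in range(20)]
ys = [3*x**2 - 2*x + 1 + random.gauss(0, 0.1) for x in xs]  # noisy quadratic
params = [random.random() for _ in range(3)]                 # step 1: initialize
lr = 1e-2                                                    # the learning rate
losses = []
for epoch in range(200):                                     # step 6: repeat...
    preds = [f(x, params) for x in xs]                       # step 2: predict
    loss = mse(preds, ys)                                    # step 3: calculate the loss
    g = grads(params, xs, ys)                                # step 4: calculate the gradients
    params = [p - lr*gi for p, gi in zip(params, g)]         # step 5: step the weights
    losses.append(loss)
# step 7: stop -- here, simply after a fixed number of epochs
print(round(losses[0], 3), round(losses[-1], 3))  # the loss should shrink
```

The shape is the same loop the notebook builds with PyTorch; only the gradient computation changes.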
] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Summarizing gradient descent" + "### Summarizing Gradient Descent" ] }, { @@ -3488,6 +3563,7 @@ } ], "source": [ + "#hide_input\n", "#id gradient_descent\n", "#caption The gradient descent process\n", "#alt Graph showing the steps for Gradient Descent\n", @@ -3503,23 +3579,25 @@ "source": [ "To summarize, at the beginning, the weights of our model can be random (training *from scratch*) or come from a pretrained model (*transfer learning*). In the first case, the output we will get from our inputs won't have anything to do with what we want, and even in the second case, it's very likely the pretrained model won't be very good at the specific task we are targeting. So the model will need to *learn* better weights.\n", "\n", - "To do this, we will compare the outputs the model gives us with our targets (we have labelled data, so we know what result the model should give) using a *loss function*, which returns a number that needs to be as low as possible. Our weights need to be improved. To do this, we take a few data items (such as images) that and feed them to our model. After going through our model, we compare the corresponding targets using our loss function. The score we get tells us how wrong our predictions were, and we will change the weights a little bit to make it slightly better.\n", + "We begin by comparing the outputs the model gives us with our targets (we have labeled data, so we know what result the model should give) using a *loss function*, which returns a number that we want to make as low as possible by improving our weights. To do this, we take a few data items (such as images) and feed them to our model. We compare the corresponding targets using our loss function, and the score we get tells us how wrong our predictions were. 
We then change the weights a little bit to make it slightly better.\n",
     "\n",
-    "To find how to change the weights to make the loss a bit better, we use calculus to calculate the *gradient*. (Actually, we let PyTorch do it for us!) Let's imagine you are lost in the mountains with your car parked at the lowest point. To find your way, you might wander in a random direction but that probably won't help much. Since you know your vehicle is at the lowest point, you would be better to go downhill. By always taking a step in the direction of the steepest downward slope, you should eventually arrive at your destination. We use the magnitude of the gradient (i.e., the steepness of the slope) to tell us how big a step to take; specifically, we multiply the gradient by a number we choose called the *learning rate* to decide on the step size."
+    "To find how to change the weights to make the loss a bit better, we use calculus to calculate the *gradients*. (Actually, we let PyTorch do it for us!) Let's consider an analogy. Imagine you are lost in the mountains with your car parked at the lowest point. To find your way back to it, you might wander in a random direction, but that probably wouldn't help much. Since you know your vehicle is at the lowest point, you would be better off going downhill. By always taking a step in the direction of the steepest downward slope, you should eventually arrive at your destination. We use the magnitude of the gradient (i.e., the steepness of the slope) to tell us how big a step to take; specifically, we multiply the gradient by a number we choose called the *learning rate* to decide on the step size. We then *iterate* until we have reached the lowest point, which will be our parking lot, and then we can *stop*.\n",
+    "\n",
+    "Everything we just saw can be transposed directly to the MNIST dataset, except for the loss function. Let's now see how we can define a good training objective. 
" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## MNIST loss function" + "## The MNIST Loss Function" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "We already have our `x`s--that's the images themselves. We'll concatenate them all into a single tensor, and also change them from a list of matrices (a rank 3 tensor) to a list of vectors (a rank 2 tensor). We can do this using `view`, which is a PyTorch method that changes the shape of a tensor without changing its contents. `-1` is a special parameter to `view`. It means: make this axis as big as necessary to fit all the data." + "We already have our dependent variables `x`--these are the images themselves. We'll concatenate them all into a single tensor, and also change them from a list of matrices (a rank-3 tensor) to a list of vectors (a rank-2 tensor). We can do this using `view`, which is a PyTorch method that changes the shape of a tensor without changing its contents. `-1` is a special parameter to `view` that means \"make this axis as big as necessary to fit all the data\":" ] }, { @@ -3535,7 +3613,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We need a label for each. We'll use `1` for threes and `0` for sevens:" + "We need a label for each image. We'll use `1` for 3s and `0` for 7s:" ] }, { @@ -3563,7 +3641,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "A Dataset in PyTorch is required to return a tuple of `(x,y)` when indexed. Python provides a `zip` function which, when combined with `list`, provides a simple way to get this functionality:" + "A `Dataset` in PyTorch is required to return a tuple of `(x,y)` when indexed. 
Python provides a `zip` function which, when combined with `list`, provides a simple way to get this functionality:" ] }, { @@ -3603,7 +3681,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Now we need an (initially random) weight for every pixel (this is the *initialize* step in our 7-step process):" + "Now we need an (initially random) weight for every pixel (this is the *initialize* step in our seven-step process):" ] }, { @@ -3628,7 +3706,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The function `weights*pixels` won't be flexible enough--it is always equal to zero when the pixels are equal to zero (i.e. it's *intercept* is zero). You might remember from high school math that the formula for a line is `y=w*x+b`; we still need the `b`. We'll initialize it to a random number too:" + "The function `weights*pixels` won't be flexible enough--it is always equal to 0 when the pixels are equal to 0 (i.e., its *intercept* is 0). You might remember from high school math that the formula for a line is `y=w*x+b`; we still need the `b`. We'll initialize it to a random number too:" ] }, { @@ -3685,9 +3763,9 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Whilst we could use a python for loop to calculate the prediction for each image, that would be very slow. Because Python loops don't run on the GPU, and because Python is a slow language for loops in general, we need to represent as much of the computation in a model as possible using higher-level functions.\n", + "While we could use a Python `for` loop to calculate the prediction for each image, that would be very slow. Because Python loops don't run on the GPU, and because Python is a slow language for loops in general, we need to represent as much of the computation in a model as possible using higher-level functions.\n", "\n", - "In this case, there's an extremely convenient mathematical operation that calculates `w*x` for every row of a matrix--it's called *matrix multiplication*. 
<> shows what matrix multiplication looks like (diagram from Wikipedia)." + "In this case, there's an extremely convenient mathematical operation that calculates `w*x` for every row of a matrix--it's called *matrix multiplication*. <> shows what matrix multiplication looks like." ] }, { @@ -3701,7 +3779,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "This image shows two matrices, `A` and `B` being multiplied together. Each item of the result, which we'll call `AB`, contains each item of its corresponding row of `A` multiplied by each item of its corresponding column of `B`, added together. For instance, row 1 column 2 (the orange dot with a red border) is calculated as $a_{1,1} * b_{1,2} + a_{1,2} * b_{2,2}$. If you need a refresher on matrix multiplication, we suggest you take a look at the great *Introduction to Matrix Multiplication* on *Khan Academy*, since this is the most important mathematical operation in deep learning.\n", + "This image shows two matrices, `A` and `B`, being multiplied together. Each item of the result, which we'll call `AB`, contains each item of its corresponding row of `A` multiplied by each item of its corresponding column of `B`, added together. For instance, row 1, column 2 (the orange dot with a red border) is calculated as $a_{1,1} * b_{1,2} + a_{1,2} * b_{2,2}$. If you need a refresher on matrix multiplication, we suggest you take a look at the [Intro to Matrix Multiplication](https://youtu.be/kT4Mp9EdVqs) on *Khan Academy*, since this is the most important mathematical operation in deep learning.\n", "\n", "In Python, matrix multiplication is represented with the `@` operator. Let's try it:" ] @@ -3745,7 +3823,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Let's check our accuracy. To decide if an output represents a 3 or a 7, we can just check whether it's greater than zero. So our accuracy for each item can be calculated (using broadcasting, so no loops!) with:" + "Let's check our accuracy. 
To decide if an output represents a 3 or a 7, we can just check whether it's greater than 0, so our accuracy for each item can be calculated (using broadcasting, so no loops!) with:" ] }, { @@ -3838,28 +3916,28 @@ "source": [ "As we've seen, we need gradients in order to improve our model using SGD, and in order to calculate gradients we need some *loss function* that represents how good our model is. That is because the gradients are a measure of how that loss function changes with small tweaks to the weights.\n", "\n", - "So we need to choose a loss function. The obvious approach would be to use accuracy, which is our metric, as our loss function as well. In this case, we would calculate our prediction for each image, collect these values to calculate an overall accuracy, and then calculate the gradients of each weight with respect to that overall accuracy.\n", + "So, we need to choose a loss function. The obvious approach would be to use accuracy, which is our metric, as our loss function as well. In this case, we would calculate our prediction for each image, collect these values to calculate an overall accuracy, and then calculate the gradients of each weight with respect to that overall accuracy.\n", "\n", - "Unfortunately, we have a significant technical problem here. The gradient of a function is its *slope*, or its steepness, which can be defined as *rise over run* -- that is, how much the value of function goes up or down, divided by how much you changed the input. We can write this in maths: `(y_new-y_old) / (x_new-x_old)`. Specifically, it is defined when x_new is very similar to x_old, meaning that their difference is very small. But accuracy only changes at all when a prediction changes from a 3 to a 7, or vice versa. So the problem is that a small change in weights from x_old to x_new isn't likely to cause any prediction to change, so `(y_new - y_old)` will be zero. In other words, the gradient is zero almost everywhere." 
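That flat-gradient problem can be demonstrated numerically. In this sketch (plain Python; `accuracy_like` and `smooth_loss` are made-up stand-ins, not the book's functions), a rise-over-run estimate returns no signal for a thresholded function but a useful one for a smooth function:

```python
def rise_over_run(f, x, eps=1e-6):
    # (y_new - y_old) / (x_new - x_old) with a tiny change in the input
    return (f(x + eps) - f(x)) / eps

def accuracy_like(x):
    return 1.0 if x > 0.5 else 0.0   # constant almost everywhere

def smooth_loss(x):
    return (x - 1.0) ** 2            # responds to any small change

print(rise_over_run(accuracy_like, 0.3))  # -> 0.0: nothing to learn from
print(rise_over_run(smooth_loss, 0.3))    # about -1.4: a direction and a size
```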
+    "Unfortunately, we have a significant technical problem here. The gradient of a function is its *slope*, or its steepness, which can be defined as *rise over run*--that is, how much the value of the function goes up or down, divided by how much we changed the input. We can write this mathematically as: `(y_new-y_old) / (x_new-x_old)`. This gives us a good approximation of the gradient when `x_new` is very similar to `x_old`, meaning that their difference is very small. But accuracy only changes at all when a prediction changes from a 3 to a 7, or vice versa. The problem is that a small change in weights from `x_old` to `x_new` isn't likely to cause any prediction to change, so `(y_new - y_old)` will almost always be 0. In other words, the gradient is 0 almost everywhere."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-    "As a result, a very small change in the value of a weight will often not actually change the accuracy at all. This means it is not useful to use accuracy as a loss function. When we use accuracy as a loss function, most of the time our gradients will actually be zero, and the model will not be able to learn from that number. That is not much use at all!\n",
+    "A very small change in the value of a weight will often not actually change the accuracy at all. This means it is not useful to use accuracy as a loss function--if we do, most of the time our gradients will actually be 0, and the model will not be able to learn from that number.\n",
     "\n",
-    "> S: In mathematical terms, accuracy is a function that is constant almost everywhere (except at the threshold, 0.5) so its derivative is nil almost everywhere (and infinity at the threshold). 
This then gives gradients that are zero or infinite, so, useless to do an update of gradient descent.\n",
+    "> S: In mathematical terms, accuracy is a function that is constant almost everywhere (except at the threshold, 0.5), so its derivative is nil almost everywhere (and infinity at the threshold). This then gives gradients that are 0 or infinite, which are useless for updating the model.\n",
     "\n",
-    "Instead, we need a loss function which, when our weights result in slightly better predictions, gives us a slightly better loss. So what does a \"slightly better prediction\" look like, exactly? Well, in this case, it means that, if the correct answer is a 3, then the score is a little higher, or if the correct answer is a 7, then the score is a little lower.\n",
+    "Instead, we need a loss function which, when our weights result in slightly better predictions, gives us a slightly better loss. So what does a \"slightly better prediction\" look like, exactly? Well, in this case, it means that if the correct answer is a 3 the score is a little higher, or if the correct answer is a 7 the score is a little lower.\n",
     "\n",
     "Let's write such a function now. What form does it take?\n",
     "\n",
-    "The loss function receives not the images themseles, but the prediction from the model. So let's make one argument, `predictions`, a vector (i.e., a rank-1 tensor), indexed over the images, of values between 0 and 1, where each value is the prediction indicating how likely it is that component's image is a 3.\n",
+    "The loss function receives not the images themselves, but the predictions from the model. Let's make one argument, `prds`, of values between 0 and 1, where each value is the prediction that an image is a 3. It is a vector (i.e., a rank-1 tensor), indexed over the images.\n",
     "\n",
-    "The purpose of the loss function is to measure the difference between predicted values and the true values -- that is, the targets (aka, the labels). 
So let's make another argument `targets`, a vector (i.e., another rank-1 tensor), indexed over the images, with a value of 0 or 1 which tells whether that image actually is a 3.\n", + "The purpose of the loss function is to measure the difference between predicted values and the true values -- that is, the targets (aka labels). Let's make another argument, `trgts`, with values of 0 or 1 which tells whether an image actually is a 3 or not. It is also a vector (i.e., another rank-1 tensor), indexed over the images.\n", "\n", - "So, for instance, suppose we had three images which we knew were a 3, a 7, and a 3. And suppose our model predicted with high confidence that the first was a 3, with slight confidence that the second was a 7, and with fair confidence (and incorrectly!) that the last was a 7. This would mean our loss function would receive these values as its inputs:" + "So, for instance, suppose we had three images which we knew were a 3, a 7, and a 3. And suppose our model predicted with high confidence (`0.9`) that the first was a 3, with slight confidence (`0.4`) that the second was a 7, and with fair confidence (`0.2`), but incorrectly, that the last was a 7. This would mean our loss function would receive these values as its inputs:" ] }, { @@ -3876,7 +3954,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Here's a first try at a loss function that measures the distance between predictions and targets:" + "Here's a first try at a loss function that measures the distance between `predictions` and `targets`:" ] }, { @@ -3895,9 +3973,14 @@ "source": [ "We're using a new function, `torch.where(a,b,c)`. This is the same as running the list comprehension `[b[i] if a[i] else c[i] for i in range(len(a))]`, except it works on tensors, at C/CUDA speed. 
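As a concrete check of that equivalence, here is the list-comprehension form applied to the example values (plain Python lists standing in for the tensors):

```python
def where(a, b, c):
    # The pure-Python equivalent of torch.where(a, b, c) on rank-1 inputs
    return [b[i] if a[i] else c[i] for i in range(len(a))]

trgts = [1, 0, 1]
prds  = [0.9, 0.4, 0.2]

# Distance from 1 where the target is 1, distance from 0 where it is 0:
dists = where(trgts, [1 - p for p in prds], prds)
print([round(d, 2) for d in dists])        # -> [0.1, 0.4, 0.8]
print(round(sum(dists) / len(dists), 4))   # the mean of those distances
```

The tensor version does exactly this, but element-wise at C/CUDA speed.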
In plain English, this function will measure how distant each prediction is from 1 if it should be 1, and how distant it is from 0 if it should be 0, and then it will take the mean of all those distances.\n", "\n", - "> note: It's important to learn about PyTorch functions like this, because looping over tensors in Python performs at Python speed, not C/CUDA speed!\n", - "\n", - "Try running `help(torch.where)` now to read the docs for this function, or, better still, look it up on the PyTorch documentation site." + "> note: Read the Docs: It's important to learn about PyTorch functions like this, because looping over tensors in Python performs at Python speed, not C/CUDA speed! Try running `help(torch.where)` now to read the docs for this function, or, better still, look it up on the PyTorch documentation site." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's try it on our `prds` and `trgts`:" ] }, { @@ -3924,7 +4007,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "You can see that this function returns a lower number when predictions are more accurate, when accurate predictions are more confident (higher absolute values), and when inaccurate predictions are less confident. In PyTorch, we always assume that a lower value of a loss function is better." + "You can see that this function returns a lower number when predictions are more accurate, when accurate predictions are more confident (higher absolute values), and when inaccurate predictions are less confident. In PyTorch, we always assume that a lower value of a loss function is better. Since we need a scalar for the final loss, `mnist_loss` takes the mean of the previous tensor:" ] }, { @@ -3951,7 +4034,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "For instance, if we change our prediction for the one \"false\" target from `0.2` to `0.8` the loss will go down, indicating that this is a better prediction." 
+    "For instance, if we change our prediction for the one \"false\" target from `0.2` to `0.8` the loss will go down, indicating that this is a better prediction:"
   ]
  },
@@ -3978,7 +4061,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-    "One problem with mnist_loss as currently defined is that it assumes that predictions are always between zero and one. We need to ensure, then, that this is actually the case! As it happens, there is a function that does exactly that--it always outputs a number between zero and one and it's called sigmoid."
+    "One problem with `mnist_loss` as currently defined is that it assumes that predictions are always between 0 and 1. We need to ensure, then, that this is actually the case! As it happens, there is a function that does exactly that--let's take a look."
   ]
  },
@@ -3992,7 +4075,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-    "The function called *sigmoid* is defined by:"
+    "The `sigmoid` function always outputs a number between 0 and 1. It's defined as follows:"
   ]
  },
@@ -4008,7 +4091,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-    "Pytorch actually already defines this for us, so we don’t really need our own version. This is an important function in deep learning, since we often want to ensure values are between zero and one. This is what it looks like:"
+    "PyTorch defines an accelerated version for us, so we don’t really need our own. This is an important function in deep learning, since we often want to ensure values are between 0 and 1. This is what it looks like:"
   ]
  },
@@ -4057,37 +4140,37 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-    "Now we can be confident our loss function will work, even if the predictions are not between 0 and 1. 
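A quick check of that formula, using the standard library's `math.exp` in place of `torch.exp` (a sketch, not the accelerated PyTorch version): every output lands strictly between 0 and 1, and the ordering of the inputs is preserved.

```python
import math

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

outs = [sigmoid(x) for x in (-20, -5, 0, 5, 20)]
print([round(o, 4) for o in outs])  # squashed into (0, 1), still increasing
assert all(0 < o < 1 for o in outs)
assert outs == sorted(outs)
```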
All that is required is that a higher prediction corresponds to higher confidence an image is a 3.\n", "\n", - "Having defined a loss function, now is a good moment to recapitulate why we did this. After all, we already had a *metric*, which was overall accuracy. So why did we define a *loss*?\n", + "Having defined a loss function, now is a good moment to recapitulate why we did this. After all, we already had a metric, which was overall accuracy. So why did we define a loss?\n", "\n", - "The key difference is that the metric is to drive human understanding and the loss is to drive automated learning. To drive automated learning, the loss must be a function which has a meaningful derivative. It can't have big flat sections, and large jumps, but instead must be reasonably smooth. This is why we designed a loss function that would respond to small changes in confidence level. This requirement on loss means that sometimes it does not really reflect exactly what we are trying to achieve, but is rather a compromise between our real goal, and a function that can be optimised using its gradient. The loss function is calculated for each item in our dataset, and then at the end of an epoch these are all averaged, and the overall mean is reported for the epoch.\n", + "The key difference is that the metric is to drive human understanding and the loss is to drive automated learning. To drive automated learning, the loss must be a function that has a meaningful derivative. It can't have big flat sections and large jumps, but instead must be reasonably smooth. This is why we designed a loss function that would respond to small changes in confidence level. This requirement means that sometimes it does not really reflect exactly what we are trying to achieve, but is rather a compromise between our real goal, and a function that can be optimized using its gradient. 
The loss function is calculated for each item in our dataset, and then at the end of an epoch the loss values are all averaged and the overall mean is reported for the epoch.\n", "\n", - "Metrics, on the other hand, are the numbers that we really care about. These are the things which are printed at the end of each epoch, and tell us how our model is really doing. It is important that we learn to focus on these metrics, rather than the loss, when judging the performance of a model." + "Metrics, on the other hand, are the numbers that we really care about. These are the values that are printed at the end of each epoch that tell us how our model is really doing. It is important that we learn to focus on these metrics, rather than the loss, when judging the performance of a model." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### SGD and mini-batches" + "### SGD and Mini-Batches" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Now that we have a loss function which is suitable to drive SGD, we can consider some of the details involved in the next phase of the learning process, which is to *step* (i.e., change or update) the weights based on the gradients. This is called an optimisation step.\n", + "Now that we have a loss function that is suitable for driving SGD, we can consider some of the details involved in the next phase of the learning process, which is to change or update the weights based on the gradients. This is called an *optimization step*.\n", "\n", - "In order to take an optimiser step we need to calculate the loss over one or more data items. How many should we use? We could calculate it for the whole dataset, and take the average, or we could calculate it for a single data item. But neither of these is ideal. Calculating it for the whole dataset would take a very long time. Calculating it for a single item would not use much information, and so it would result in a very imprecise and unstable gradient. 
That is, you'd be going to the trouble of updating the weights but taking into account only how that would improve the model's performance on that single item.\n", + "In order to take an optimization step we need to calculate the loss over one or more data items. How many should we use? We could calculate it for the whole dataset, and take the average, or we could calculate it for a single data item. But neither of these is ideal. Calculating it for the whole dataset would take a very long time. Calculating it for a single item would not use much information, so it would result in a very imprecise and unstable gradient. That is, you'd be going to the trouble of updating the weights, but taking into account only how that would improve the model's performance on that single item.\n", "\n", - "So instead we take a compromise between the two: we calculate the average loss for a few data items at a time. This is called a *mini-batch*. The number of data items in the mini batch is called the *batch size*. A larger batch size means that you will get a more accurate and stable estimate of your dataset's gradient on the loss function, but it will take longer, and you will get less mini-batches per epoch. Choosing a good batch size is one of the decisions you need to make as a deep learning practitioner to train your model quickly and accurately. We will talk about how to make this choice throughout this book.\n", + "So instead we take a compromise between the two: we calculate the average loss for a few data items at a time. This is called a *mini-batch*. The number of data items in the mini-batch is called the *batch size*. A larger batch size means that you will get a more accurate and stable estimate of your dataset's gradients from the loss function, but it will take longer, and you will process fewer mini-batches per epoch. Choosing a good batch size is one of the decisions you need to make as a deep learning practitioner to train your model quickly and accurately. 
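That accuracy/stability trade-off can be simulated with the standard library. Here the per-item "gradients" are just noisy numbers around a true value of 2.0 (a made-up stand-in for real per-item gradients); averaging bigger mini-batches gives estimates that cluster more tightly around the truth:

```python
import random
import statistics

random.seed(0)
item_grads = [2.0 + random.gauss(0, 1) for _ in range(10_240)]

def batch_estimates(bs):
    # One gradient estimate per mini-batch: the mean of its items.
    return [sum(item_grads[i:i + bs]) / bs
            for i in range(0, len(item_grads), bs)]

spreads = {bs: statistics.stdev(batch_estimates(bs)) for bs in (1, 16, 256)}
for bs, sd in spreads.items():
    print(bs, round(sd, 3))  # the spread of the estimates shrinks as bs grows
```

The cost of that stability is fewer updates per epoch: 10,240 items give 10,240 mini-batches at `bs=1` but only 40 at `bs=256`.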
We will talk about how to make this choice throughout this book.\n", "\n", - "Another good reason for using mini-batches rather than calculating the gradient on individual data items is that, in practice, we nearly always do our training on an accelerator such as a GPU. These accelerators only perform well if they have lots of work to do at a time. So it is helpful if we can give them lots of data items to work on at a time. Using mini-batches is one of the best ways to do this. However, if you give them too much data to work on at once, they run out of memory--making GPUs happy is also tricky!\n", + "Another good reason for using mini-batches rather than calculating the gradient on individual data items is that, in practice, we nearly always do our training on an accelerator such as a GPU. These accelerators only perform well if they have lots of work to do at a time, so it's helpful if we can give them lots of data items to work on. Using mini-batches is one of the best ways to do this. However, if you give them too much data to work on at once, they run out of memory--making GPUs happy is also tricky!\n", "\n", - "As we've seen, in the discussion of data augmentation, we get better generalisation if we can vary things during training. A simple and effective thing we can vary during training is what data items we put in each mini batch. Rather than simply enumerating our dataset in order for every epoch, instead what we normally do is to randomly shuffle it on every epoch, before we create mini batches. PyTorch and fastai provide a class that will do the shuffling and mini batch collation for you, called `DataLoader`.\n", + "As we saw in our discussion of data augmentation in <>, we get better generalization if we can vary things during training. One simple and effective thing we can vary is what data items we put in each mini-batch. 
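Varying what goes into each mini-batch amounts to reshuffling the data before chunking it. A standard-library sketch of that idea (an illustration only, not fastai's implementation):

```python
import random

def mini_batches(ds, bs, seed=None):
    # Shuffle a copy of the indices each epoch, then yield fixed-size chunks.
    idxs = list(range(len(ds)))
    random.Random(seed).shuffle(idxs)
    for i in range(0, len(idxs), bs):
        yield [ds[j] for j in idxs[i:i + bs]]

coll = list(range(15))
epoch1 = list(mini_batches(coll, bs=5, seed=0))
epoch2 = list(mini_batches(coll, bs=5, seed=1))
print(epoch1)  # three batches of five items, in shuffled order
# Different epochs group the items differently, but each epoch still
# sees every item exactly once:
assert sorted(sum(epoch1, [])) == sorted(sum(epoch2, [])) == coll
```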
Rather than simply enumerating our dataset in order for every epoch, instead what we normally do is randomly shuffle it on every epoch, before we create mini-batches. PyTorch and fastai provide a class that will do the shuffling and mini-batch collation for you, called `DataLoader`.\n", "\n", - "A `DataLoader` can take any Python collection, and turn it into an iterator over many batches, like so:" + "A `DataLoader` can take any Python collection and turn it into an iterator over many batches, like so:" ] }, { @@ -4118,7 +4201,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "For training a model, we don't just want any Python collection, but a collection containing independent and dependent variables (that is, the inputs and targets of the model). A collection that contains tuples of independent and dependent variables is known in PyTorch as a Dataset. Here's an example of an extremely simple Dataset:" + "For training a model, we don't just want any Python collection, but a collection containing independent and dependent variables (that is, the inputs and targets of the model). A collection that contains tuples of independent and dependent variables is known in PyTorch as a `Dataset`. 
Here's an example of an extremely simple `Dataset`:" ] }, { @@ -4146,7 +4229,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "When we pass a Dataset to a DataLoader we will get back many batches which are themselves tuples of tensors representing batches of independent and dependent variables:" + "When we pass a `Dataset` to a `DataLoader` we will get back many batches which are themselves tuples of tensors representing batches of independent and dependent variables:" ] }, { @@ -4185,7 +4268,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Putting it all together" + "## Putting It All Together" ] }, { @@ -4389,7 +4472,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "...and test it:" + "and test it:" ] }, { @@ -4445,7 +4528,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The gradients have changed! The reason for this is that `loss.backward` actually *adds* the gradients of `loss` to any gradients that are currently stored. So we have to set the current gradients to zero first." + "The gradients have changed! The reason for this is that `loss.backward` actually *adds* the gradients of `loss` to any gradients that are currently stored. So, we have to set the current gradients to 0 first:" ] }, { @@ -4462,14 +4545,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "> note: Methods in PyTorch that end in an underscore modify their object _in-place_. For instance, `bias.zero_()` sets all elements of the tensor `bias` to zero." + "> note: Inplace Operations: Methods in PyTorch whose names end in an underscore modify their objects _in place_. For instance, `bias.zero_()` sets all elements of the tensor `bias` to 0." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Our only remaining step will be to update the weights and bias based on the gradient and learning rate. 
When we do so, we have to tell PyTorch not to take the gradient of this step too, otherwise things will get very confusing when we try to compute the derivative at the next batch! If we assign to the `data` attribute of a tensor then PyTorch will not take the gradient of that step. Here's our basic training loop for an epoch:" + "Our only remaining step is to update the weights and biases based on the gradient and learning rate. When we do so, we have to tell PyTorch not to take the gradient of this step too--otherwise things will get very confusing when we try to compute the derivative at the next batch! If we assign to the `data` attribute of a tensor then PyTorch will not take the gradient of that step. Here's our basic training loop for an epoch:" ] }, { @@ -4490,7 +4573,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We also want to know how we're doing, by looking at the accuracy of the validation set. To decide if an output represents a 3 or a 7, we can just check whether it's greater than zero. So our accuracy for each item can be calculated (using broadcasting, so no loops!) with:" + "We also want to check how we're doing, by looking at the accuracy of the validation set. To decide if an output represents a 3 or a 7, we can just check whether it's greater than 0. So our accuracy for each item can be calculated (using broadcasting, so no loops!) with:" ] }, { @@ -4566,7 +4649,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "...and then putting the batches together:" + "and then put the batches together:" ] }, { @@ -4634,7 +4717,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "...and do a few more:" + "Then do a few more:" ] }, { @@ -4660,23 +4743,23 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Looking good! We're already about at the same accuracy as our \"pixel similarity\" approach, and we've created a general purpose foundation we can build on. 
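The shape of the loop we have been building can be boiled down to a tiny sketch: a one-parameter model `y = w*x` with a squared-error loss and a gradient worked out by hand, so no autograd is involved (plain Python, toy data of our own, purely illustrative):

```python
# One-parameter model y = w*x, squared-error loss, gradient derived by hand.
xs, ys = [1.0, 2.0, 3.0], [2.0, 4.0, 6.0]   # toy data: the true w is 2.0
w, lr = 0.0, 0.05                            # initial parameter, learning rate

for epoch in range(100):
    preds = [w * x for x in xs]                                              # predict
    grad = sum(2 * (p - y) * x for p, y, x in zip(preds, ys, xs)) / len(xs)  # gradient of the loss
    w -= lr * grad                                                           # step, then repeat

print(round(w, 4))  # → 2.0
```

Every piece of the full training loop (predictions, loss gradient, parameter step) is here; PyTorch's contribution is mainly computing `grad` for us.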
Our next step will be to create an object that will handle the SGD step for us. In PyTorch, it's called an *optimizer*." + "Looking good! We're already about at the same accuracy as our \"pixel similarity\" approach, and we've created a general-purpose foundation we can build on. Our next step will be to create an object that will handle the SGD step for us. In PyTorch, it's called an *optimizer*." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Creating an optimizer" + "### Creating an Optimizer" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Because this is such a general foundation, PyTorch provides some useful classes to make it easier to implement. The first we'll use is to replace our `linear()` function with PyTorch's `nn.Linear` *module*. A \"module\" is an object of a class that inherits from the PyTorch `nn.Module` class. Objects of this class behave identically to a standard Python function, in that you can call it using parentheses, and it will return the activations of a model.\n", + "Because this is such a general foundation, PyTorch provides some useful classes to make it easier to implement. The first thing we can do is replace our `linear` function with PyTorch's `nn.Linear` module. A *module* is an object of a class that inherits from the PyTorch `nn.Module` class. Objects of this class behave identically to standard Python functions, in that you can call them using parentheses and they will return the activations of a model.\n", "\n", - "`nn.Linear` does the same thing as our `init_params` and `linear` together. It contains both the *weights* and *bias* in a single class. Here's how we replicate our model from the previous section:" + "`nn.Linear` does the same thing as our `init_params` and `linear` together. It contains both the *weights* and *biases* in a single class. 
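To make the "weights and bias in a single class" idea concrete, here is a stripped-down sketch of the bundling (a toy class of our own, not the real `nn.Linear`):

```python
import random

class TinyLinear:
    "Toy stand-in for nn.Linear: holds a weight matrix and a bias vector together."
    def __init__(self, n_in, n_out, seed=0):
        rng = random.Random(seed)
        self.weight = [[rng.gauss(0, 1) for _ in range(n_in)] for _ in range(n_out)]
        self.bias = [0.0] * n_out
    def __call__(self, x):
        # forward pass: x @ weight.T + bias
        return [sum(w * xi for w, xi in zip(row, x)) + b
                for row, b in zip(self.weight, self.bias)]

lin = TinyLinear(3, 2)
print(len(lin([1.0, 2.0, 3.0])))  # → 2
```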
Here's how we replicate our model from the previous section:" ] }, { @@ -4825,7 +4908,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The results are the same as the previous section." + "The results are the same as in the previous section:" ] }, { @@ -4875,7 +4958,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "fastai also provides `Learner.fit`, which we can use instead of `train_model`. To create a `Learner` we first need to create `DataLoaders`, by passing in our training and validation `DataLoader`s:" + "fastai also provides `Learner.fit`, which we can use instead of `train_model`. To create a `Learner` we first need to create a `DataLoaders`, by passing in our training and validation `DataLoader`s:" ] }, { @@ -4891,7 +4974,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "To create a `Learner` without using an application (such as `cnn_learner`) we need to pass in all the information that we've created in this chapter: the `DataLoaders`, the model, the optimization function (which will be passed the parameters), the loss function, and optionally any metrics to print:" + "To create a `Learner` without using an application (such as `cnn_learner`) we need to pass in all the elements that we've created in this chapter: the `DataLoaders`, the model, the optimization function (which will be passed the parameters), the loss function, and optionally any metrics to print:" ] }, { @@ -5028,14 +5111,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Adding a non-linearity" + "## Adding a Nonlinearity" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "So far we have a general procedure for optimising the parameters of a function, and we have tried it out on a very boring function: a simple linear classifier. A linear classifier is very constrained in terms of what it can do. 
To make it a bit more complex (and able to handle more tasks), we need to add a non-linearity between two linear classifiers, and this is what gives us a neural network.\n", + "So far we have a general procedure for optimizing the parameters of a function, and we have tried it out on a very boring function: a simple linear classifier. A linear classifier is very constrained in terms of what it can do. To make it a bit more complex (and able to handle more tasks), we need to add something nonlinear between two linear classifiers--this is what gives us a neural network.\n", "\n", "Here is the entire definition of a basic neural network:" ] @@ -5057,9 +5140,9 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "That's it! All we have in `simple_net` is two linear classifiers with a max function between them.\n", + "That's it! All we have in `simple_net` is two linear classifiers with a `max` function between them.\n", "\n", - "Here, `w1` and `w2` are weight tensors, and `b1` and `b2` are bias tensors; that is, parameters that are initially randomly initialised, just like we did in the previous section." + "Here, `w1` and `w2` are weight tensors, and `b1` and `b2` are bias tensors; that is, parameters that are initially randomly initialized, just like we did in the previous section:" ] }, { @@ -5080,7 +5163,7 @@ "source": [ "The key point about this is that `w1` has 30 output activations (which means that `w2` must have 30 input activations, so they match). That means that the first layer can construct 30 different features, each representing some different mix of pixels. You can change that `30` to anything you like, to make the model more or less complex.\n", "\n", - "That little function `res.max(tensor(0.0))` is called a *rectified linear unit*, also known as *ReLU*. We think we can all agree that *rectified linear unit* sounds pretty fancy and complicated... 
But actually, there's nothing more to it than `res.max(tensor(0.0))`, in other words: replace every negative number with a zero. This tiny function is also available in PyTorch as `F.relu`:" + "That little function `res.max(tensor(0.0))` is called a *rectified linear unit*, also known as *ReLU*. We think we can all agree that *rectified linear unit* sounds pretty fancy and complicated... But actually, there's nothing more to it than `res.max(tensor(0.0))`--in other words, replace every negative number with a zero. This tiny function is also available in PyTorch as `F.relu`:" ] }, { @@ -5109,32 +5192,32 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "> J: There is an enormous amount of jargon in deep learning, such as: _rectified linear unit_. The vast vast majority of this jargon is no more complicated than can be implemented in a short line of code and Python, as we saw in this example. The reality is that for academics to get their papers published they need to make them sound as impressive and sophisticated as possible. One of the ways that they do that is to introduce jargon. Unfortunately, this has the result that the field ends up becoming far more intimidating and difficult to get into than it should be. You do have to learn the jargon, because otherwise papers and tutorials are not going to mean much to you. But that doesn't mean you have to find the jargon intimidating. Just remember, when you come across a word or phrase that you haven't seen before, it will almost certainly turn out that it is a very simple concept that it is referring to." + "> J: There is an enormous amount of jargon in deep learning, including terms like _rectified linear unit_. The vast vast majority of this jargon is no more complicated than can be implemented in a short line of code, as we saw in this example. The reality is that for academics to get their papers published they need to make them sound as impressive and sophisticated as possible. 
One of the ways that they do that is to introduce jargon. Unfortunately, this has the result that the field ends up becoming far more intimidating and difficult to get into than it should be. You do have to learn the jargon, because otherwise papers and tutorials are not going to mean much to you. But that doesn't mean you have to find the jargon intimidating. Just remember, when you come across a word or phrase that you haven't seen before, it will almost certainly turn out to be referring to a very simple concept." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The basic idea is that by using more linear layers, we can have our model do more computation, and therefore model more complex functions. But there's no point just putting one linear layer directly after another one, because when we multiply things together and then add them up multiple times, that could be replaced by multiplying different things together and adding them up just once! That is to say, a series of any number of linear layers in a row can be replaced with a single linear layer with a different set of parameters.\n", "\n", - "But if we put a non-linear function between them, such as max, then this is no longer true. Now, each linear layer is actually somewhat decoupled from the other ones, and can do its own useful work. The max function is particularly interesting, because it operates as a simple \"if\" statement. 
For any arbitrarily wiggly function, we can approximate it as a bunch of lines joined together; to make it more close to the wiggly function, we just have to use shorter lines." + "But if we put a nonlinear function between them, such as `max`, then this is no longer true. Now each linear layer is actually somewhat decoupled from the other ones, and can do its own useful work. The `max` function is particularly interesting, because it operates as a simple `if` statement." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "> S: Mathematically, we say the composition of two linear functions is another linear function. So we can stack as many linear classifiers on top of each other, without non-linear functions between them, it will just be the same as one linear classifier." + "> S: Mathematically, we say the composition of two linear functions is another linear function. So, we can stack as many linear classifiers as we want on top of each other, and without nonlinear functions between them, it will just be the same as one linear classifier." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Amazingly enough, it can be mathematically proven that this little function can solve any computable problem to an arbitrarily high level of accuracy, if you can find the right parameters for `w1` and `w2`, and if you make these matrices big enough. This is known as the *universal approximation theorem* . The three lines of code that we have here are known as *layers*. The first and third are known as *linear layers*, and the second line of code is known variously as a *nonlinearity*, or *activation function*.\n", + "Amazingly enough, it can be mathematically proven that this little function can solve any computable problem to an arbitrarily high level of accuracy, if you can find the right parameters for `w1` and `w2` and if you make these matrices big enough. 
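Here is a tiny sketch of why that matters: a weighted sum of shifted ReLUs (weights hand-picked by us, purely illustrative) is a piecewise-linear function, that is, straight line segments joined at "kinks":

```python
def relu(x): return max(x, 0.0)

# Three "hidden units" with hand-picked weights: the slope flips at x=1 and x=2.
def piecewise(x):
    return relu(x) - 2 * relu(x - 1) + 2 * relu(x - 2)

print([piecewise(x) for x in [0.0, 0.5, 1.0, 1.5, 2.0, 2.5]])
# → [0.0, 0.5, 1.0, 0.5, 0.0, 0.5]: slope +1, then -1, then +1
```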
For any arbitrarily wiggly function, we can approximate it as a bunch of lines joined together; to make it closer to the wiggly function, we just have to use shorter lines. This is known as the *universal approximation theorem*. The three lines of code that we have here are known as *layers*. The first and third are known as *linear layers*, and the second line of code is known variously as a *nonlinearity*, or *activation function*.\n", "\n", - "Just like the previous section, we can replace this code with something a bit simpler, by taking advantage of PyTorch:" + "Just like in the previous section, we can replace this code with something a bit simpler, by taking advantage of PyTorch:" ] }, { @@ -5154,11 +5237,11 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "`nn.Sequential` creates a module which will call each of the listed layers or functions in turn.\n", + "`nn.Sequential` creates a module that will call each of the listed layers or functions in turn.\n", "\n", - "`F.relu` is a function, not a PyTorch module. `nn.ReLU` is a PyTorch module that does exactly the same thing. Most functions that can appear in a model also have identical forms that are modules. Generally, it's just a case of replacing `F` with `nn`, and changing the capitalization. When using `nn.Sequential` PyTorch requires us to use the module version. Since modules are classes, we have to instantiate them, which is why you see `nn.ReLU()` above. Because `nn.Sequential` is a module, we can get its parameters--which will return a list of all the parameters of all modules it contains.\n", + "`nn.ReLU` is a PyTorch module that does exactly the same thing as the `F.relu` function. Most functions that can appear in a model also have identical forms that are modules. Generally, it's just a case of replacing `F` with `nn` and changing the capitalization. When using `nn.Sequential`, PyTorch requires us to use the module version. 
Since modules are classes, we have to instantiate them, which is why you see `nn.ReLU()` in this example. \n", "\n", "Because `nn.Sequential` is a module, we can get its parameters, which will return a list of all the parameters of all the modules it contains. Let's try it out! As this is a deeper model, we'll use a lower learning rate and a few more epochs." ] }, { @@ -5519,7 +5602,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "...and we can view the final accuracy:" + "And we can view the final accuracy:" ] }, { @@ -5551,15 +5634,27 @@ "1. A function that can solve any problem to any level of accuracy (the neural network) given the correct set of parameters\n", "1. A way to find the best set of parameters for any function (stochastic gradient descent)\n", "\n", - "This is why deep learning can do things which seem rather magical. Believing that this combination of simple techniques can really solve any problem here is one of the biggest steps that we find many students have to take. It seems too good to be true. It seems like things should be more difficult and complicated than this. Our recommendation: try it out! We will take our own recommendation and try this model on the MNIST dataset. Since we are doing everything from scratch ourselves (except for calculating the gradients) you know that there is no special magic hiding behind the scenes…\n", "\n", - "There is no need to stop at just two linear layers. We can add as many as we want, as long as we add a nonlinearity between each pair of linear layers. As we will learn, however, the deeper the model gets, the harder it is to optimise the parameters in practice. Later in this book we will learn about some simple but brilliantly effective techniques for training deeper models.\n", + "This is why deep learning can do such fantastic things. 
Believing that this combination of simple techniques can really solve any problem is one of the biggest steps that we find many students have to take. It seems too good to be true--surely things should be more difficult and complicated than this? Our recommendation: try it out! We just tried it on the MNIST dataset and you have seen the results. And since we are doing everything from scratch ourselves (except for calculating the gradients) you know that there is no special magic hiding behind the scenes." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Going Deeper" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "There is no need to stop at just two linear layers. We can add as many as we want, as long as we add a nonlinearity between each pair of linear layers. As you will learn, however, the deeper the model gets, the harder it is to optimize the parameters in practice. Later in this book you will learn about some simple but brilliantly effective techniques for training deeper models.\n", "\n", "We already know that a single nonlinearity with two linear layers is enough to approximate any function. So why would we use deeper models? The reason is performance. With a deeper model (that is, one with more layers) we do not need to use as many parameters; it turns out that we can use smaller matrices with more layers, and get better results than we would get with larger matrices, and fewer layers.\n", "\n", - "That means that we can train them more quickly, and our model will take up less memory. 
In the 1990s researchers were so focused on the universal approximation theorem that very few were experimenting with more than one nonlinearity. This theoretical but not practical foundation held back the field for years. Some researchers, however, did experiment with deep models, and eventually were able to show that these models could perform much better in practice. Eventually, theoretical results were developed which showed why this happens. Today, it is extremely unusual to find anybody using a neural network with just one nonlinearity.\n", + "That means that we can train the model more quickly, and it will take up less memory. In the 1990s researchers were so focused on the universal approximation theorem that very few were experimenting with more than one nonlinearity. This theoretical but not practical foundation held back the field for years. Some researchers, however, did experiment with deep models, and eventually were able to show that these models could perform much better in practice. Eventually, theoretical results were developed which showed why this happens. Today, it is extremely unusual to find anybody using a neural network with just one nonlinearity.\n", "\n", "Here is what happens when we train an 18-layer model using the same approach we saw in <>:" ] }, { @@ -5617,33 +5712,33 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Jargon recap" + "## Jargon Recap" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Congratulations: you now know how to create and train a deep neural network from scratch! 
We've gone through quite a few steps to get to this point, but you might be surprised at how simple it really is.\n", "\n", - "Now that we are at this point, it is a good opportunity to define, and review, some jargon and concepts.\n", + "Now that we are at this point, it is a good opportunity to define, and review, some jargon and key concepts.\n", "\n", - "The neural network contains a lot of numbers. But those numbers only have one of two types: numbers that are calculated, and the parameters that these are calculated from. This gives us the two most important pieces of jargon to learn:\n", + "A neural network contains a lot of numbers, but they are only of two types: numbers that are calculated, and the parameters that these numbers are calculated from. This gives us the two most important pieces of jargon to learn:\n", "\n", - "- *activations*: numbers that are calculated (both by linear and non-linear layers)\n", - "- *parameters*: numbers that are randomly initialised, and optimised (that is, the numbers that define the model)\n", + "- Activations:: Numbers that are calculated (both by linear and nonlinear layers)\n", + "- Parameters:: Numbers that are randomly initialized, and optimized (that is, the numbers that define the model)\n", "\n", "We will often talk in this book about activations and parameters. Remember that they have very specific meanings. They are numbers. They are not abstract concepts, but they are actual specific numbers that are in your model. Part of becoming a good deep learning practitioner is getting used to the idea of actually looking at your activations and parameters, and plotting them and testing whether they are behaving correctly.\n", "\n", - "Our activations and parameters are all contained in tensors. These are simply regularly shaped arrays. For example, a matrix. Matrices have rows and columns; we call these the *axes* or *dimensions*. The number of dimensions of a tensor is its *rank*. 
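As a quick sketch of rank as "number of axes" (plain Python lists and a helper of our own; a PyTorch tensor reports the same number as its `ndim` attribute):

```python
def rank(x):
    "Number of axes: how many levels of nesting a regularly shaped array has."
    r = 0
    while isinstance(x, list):
        r, x = r + 1, x[0]
    return r

scalar, vector, matrix = 3.14, [1.0, 2.0], [[1, 2], [3, 4]]
print(rank(vector), rank(matrix))  # → 1 2 (and a scalar is rank 0)
```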
There are some special tensors:\n", + "Our activations and parameters are all contained in *tensors*. These are simply regularly shaped arrays--for example, a matrix. Matrices have rows and columns; we call these the *axes* or *dimensions*. The number of dimensions of a tensor is its *rank*. There are some special tensors:\n", "\n", - "- rank zero: scalar\n", - "- rank one: vector\n", - "- rank two: matrix\n", + "- Rank zero: scalar\n", + "- Rank one: vector\n", + "- Rank two: matrix\n", "\n", - "A neural network contains a number of layers. Each layer is either linear or nonlinear. We generally alternate between these two kinds of layers in a neural network. Sometimes people refer to both a linear layer and its subsequent nonlinearity together as a single *layer*. Yes, this is confusing. Sometimes a nonlinearity is referred to as an activation function.\n", + "A neural network contains a number of layers. Each layer is either *linear* or *nonlinear*. We generally alternate between these two kinds of layers in a neural network. Sometimes people refer to both a linear layer and its subsequent nonlinearity together as a single layer. Yes, this is confusing. Sometimes a nonlinearity is referred to as an *activation function*.\n", "\n", - "<> contains the concepts related to SGD.\n", + "<> summarizes the key concepts related to SGD.\n", "\n", "```asciidoc\n", "[[dljargon1]]\n", @@ -5651,14 +5746,14 @@ "[options=\"header\"]\n", "|=====\n", "| Term | Meaning\n", - "|**ReLU** | Function that returns 0 for negative numbers and doesn't change positive numbers\n", - "|**mini-batch** | A few inputs and labels gathered together in two arrays. 
A gradient descent step is updated on this batch (rather than a whole epoch).\n", - "|**forward pass** | Applying the model to some input and computing the predictions\n", - "|**loss** | A value that represents how well (or badly) our model is doing\n", - "|**gradient** | The derivative of the loss with respect to some parameter of the model\n", - "|**backard pass** | Computing the gradients of the loss with respect to all model parameters\n", - "|**gradient descent** | Taking a step in the directions opposite to the gradients to make the model parameters a little bit better\n", - "|**learning rate** | The size of the step we take when applying SGD to update the parameters of the model\n", + "|ReLU | Function that returns 0 for negative numbers and doesn't change positive numbers.\n", + "|Mini-batch | A small group of inputs and labels gathered together in two arrays. A gradient descent step is updated on this batch (rather than a whole epoch).\n", + "|Forward pass | Applying the model to some input and computing the predictions.\n", + "|Loss | A value that represents how well (or badly) our model is doing.\n", + "|Gradient | The derivative of the loss with respect to some parameter of the model.\n", + "|Backward pass | Computing the gradients of the loss with respect to all model parameters.\n", + "|Gradient descent | Taking a step in the directions opposite to the gradients to make the model parameters a little bit better.\n", + "|Learning rate | The size of the step we take when applying SGD to update the parameters of the model.\n", "|=====\n", "```" ] }, { @@ -5667,14 +5762,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "#### _Choose Your Own Adventure_ reminder" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Did you choose to skip over chapters 2 & 3, in your excitement to peek under the hood? Well, here's your reminder to head back to chapter 2 now, because you'll be needing to know that stuff very soon!" 
+ "> note: _Choose Your Own Adventure_ Reminder: Did you choose to skip over chapters 2 & 3, in your excitement to peek under the hood? Well, here's your reminder to head back to chapter 2 now, because you'll be needing to know that stuff very soon!" ] }, { @@ -5688,20 +5776,20 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "1. How is a greyscale image represented on a computer? How about a color image?\n", + "1. How is a grayscale image represented on a computer? How about a color image?\n", "1. How are the files and folders in the `MNIST_SAMPLE` dataset structured? Why?\n", "1. Explain how the \"pixel similarity\" approach to classifying digits works.\n", "1. What is a list comprehension? Create one now that selects odd numbers from a list and doubles them.\n", - "1. What is a \"rank 3 tensor\"?\n", + "1. What is a \"rank-3 tensor\"?\n", "1. What is the difference between tensor rank and shape? How do you get the rank from the shape?\n", "1. What are RMSE and L1 norm?\n", "1. How can you apply a calculation on thousands of numbers at once, many thousands of times faster than a Python loop?\n", - "1. Create a 3x3 tensor or array containing the numbers from 1 to 9. Double it. Select the bottom right 4 numbers.\n", + "1. Create a 3\\*3 tensor or array containing the numbers from 1 to 9. Double it. Select the bottom-right four numbers.\n", "1. What is broadcasting?\n", "1. Are metrics generally calculated using the training set, or the validation set? Why?\n", "1. What is SGD?\n", - "1. Why does SGD use mini batches?\n", - "1. What are the 7 steps in SGD for machine learning?\n", + "1. Why does SGD use mini-batches?\n", + "1. What are the seven steps in SGD for machine learning?\n", "1. How do we initialize the weights in a model?\n", "1. What is \"loss\"?\n", "1. Why can't we always use a high learning rate?\n", @@ -5709,18 +5797,18 @@ "1. Do you need to know how to calculate gradients yourself?\n", "1. 
Why can't we use accuracy as a loss function?\n", "1. Draw the sigmoid function. What is special about its shape?\n", - "1. What is the difference between loss and metric?\n", + "1. What is the difference between a loss function and a metric?\n", "1. What is the function to calculate new weights using a learning rate?\n", "1. What does the `DataLoader` class do?\n", - "1. Write pseudo-code showing the basic steps taken each epoch for SGD.\n", - "1. Create a function which, if passed two arguments `[1,2,3,4]` and `'abcd'`, returns `[(1, 'a'), (2, 'b'), (3, 'c'), (4, 'd')]`. What is special about that output data structure?\n", + "1. Write pseudocode showing the basic steps taken in each epoch for SGD.\n", + "1. Create a function that, if passed two arguments `[1,2,3,4]` and `'abcd'`, returns `[(1, 'a'), (2, 'b'), (3, 'c'), (4, 'd')]`. What is special about that output data structure?\n", "1. What does `view` do in PyTorch?\n", "1. What are the \"bias\" parameters in a neural network? Why do we need them?\n", - "1. What does the `@` operator do in python?\n", + "1. What does the `@` operator do in Python?\n", "1. What does the `backward` method do?\n", "1. Why do we have to zero the gradients?\n", "1. What information do we have to pass to `Learner`?\n", - "1. Show python or pseudo-code for the basic steps of a training loop.\n", + "1. Show Python or pseudocode for the basic steps of a training loop.\n", "1. What is \"ReLU\"? Draw a plot of it for values from `-2` to `+2`.\n", "1. What is an \"activation function\"?\n", "1. What's the difference between `F.relu` and `nn.ReLU`?\n", @@ -5731,7 +5819,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research" + "### Further Research" ] }, { @@ -5739,7 +5827,7 @@ "metadata": {}, "source": [ "1. Create your own implementation of `Learner` from scratch, based on the training loop shown in this chapter.\n", - "1. 
Complete all the steps in this chapter using the full MNIST datasets (that is, for all digits, not just threes and sevens). This is a significant project and will take you quite a bit of time to complete! You'll need to do some of your own research to figure out how to overcome some obstacles you'll meet on the way." + "1. Complete all the steps in this chapter using the full MNIST datasets (that is, for all digits, not just 3s and 7s). This is a significant project and will take you quite a bit of time to complete! You'll need to do some of your own research to figure out how to overcome some obstacles you'll meet on the way." ] }, { diff --git a/05_pet_breeds.ipynb b/05_pet_breeds.ipynb index 913cf6bd2..c37a3ad2a 100644 --- a/05_pet_breeds.ipynb +++ b/05_pet_breeds.ipynb @@ -21,41 +21,41 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# Image classification" + "# Image Classification" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Now that we understand what deep learning is, what it's for, and how to create and deploy a model, it's time for us to go deeper! In an ideal world deep learning practitioners wouldn't have to know every detail of how things work under the hood… But as yet, we don't live in an ideal world. The truth is, to make your model really work, and work reliably, there's a lot of details you have to get right. And a lot of details that you have to check. This process requires being able to look inside your neural network as it trains, and as it makes predictions, find possible problems, and know how to fix them.\n", + "Now that you understand what deep learning is, what it's for, and how to create and deploy a model, it's time for us to go deeper! In an ideal world deep learning practitioners wouldn't have to know every detail of how things work under the hood… But as yet, we don't live in an ideal world. 
The truth is, to make your model really work, and work reliably, there are a lot of details you have to get right, and a lot of details that you have to check. This process requires being able to look inside your neural network as it trains, and as it makes predictions, find possible problems, and know how to fix them.\n", "\n", - "So, from here on in the book we are going to do a deep dive into the mechanics of deep learning. What is the architecture of a computer vision model, an NLP model, a tabular model, and so on. How do you create an architecture which matches the needs of your particular domain? How do you get the best possible results from the training process? How do you make things faster? What do you have to change as your datasets change?\n", + "So, from here on in the book we are going to do a deep dive into the mechanics of deep learning. What is the architecture of a computer vision model, an NLP model, a tabular model, and so on? How do you create an architecture that matches the needs of your particular domain? How do you get the best possible results from the training process? How do you make things faster? What do you have to change as your datasets change?\n", "\n", "We will start by repeating the same basic applications that we looked at in the first chapter, but we are going to do two things:\n", "\n", - "- make them better;\n", - "- apply them to a wider variety of types of data.\n", + "- Make them better.\n", + "- Apply them to a wider variety of types of data.\n", "\n", - "In order to do these two things, we will have to learn all of the pieces of the deep learning puzzle. This includes: different types of layers, regularisation methods, optimisers, putting layers together into architectures, labelling techniques, and much more. We are not just going to dump all of these things out, but we will introduce them progressively as needed, to solve an actual problem related to the project we are working on." 
+ "In order to do these two things, we will have to learn all of the pieces of the deep learning puzzle. This includes different types of layers, regularization methods, optimizers, how to put layers together into architectures, labeling techniques, and much more. We are not just going to dump all of these things on you, though; we will introduce them progressively as needed, to solve actual problems related to the projects we are working on." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## From dogs and cats, to pet breeds" + "## From Dogs and Cats to Pet Breeds" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "In our very first model we learnt how to classify dogs versus cats. Just a few years ago this was considered a very challenging task. But today, it is far too easy! We will not be able to show you the nuances of training models with this problem, because we get the nearly perfect result without worrying about any of the details. But it turns out that the same dataset also allows us to work on a much more challenging problem: figuring out what breed of pet is shown in each image.\n", + "In our very first model we learned how to classify dogs versus cats. Just a few years ago this was considered a very challenging task--but today, it's far too easy! We will not be able to show you the nuances of training models with this problem, because we get a nearly perfect result without worrying about any of the details. But it turns out that the same dataset also allows us to work on a much more challenging problem: figuring out what breed of pet is shown in each image.\n", "\n", - "In the first chapter we presented the applications as already solved problems. But this is not how things work in real life. We start with some dataset which we know nothing about. We have to understand how it is put together, how to extract the data we need from it, and what that data looks like. 
For the rest of this book we will be showing you how to solve these problems in practice, including all of these intermediate steps necessary to understand the data that we are working with and test our modelling as we go.\n", + "In <> we presented the applications as already-solved problems. But this is not how things work in real life. We start with some dataset that we know nothing about. We then have to figure out how it is put together, how to extract the data we need from it, and what that data looks like. For the rest of this book we will be showing you how to solve these problems in practice, including all of the intermediate steps necessary to understand the data that you are working with and test your modeling as you go.\n", "\n", - "We have already downloaded the pets dataset. We can get a path to this dataset using the same code we saw in <>:" + "We already downloaded the Pet dataset, and we can get a path to this dataset using the same code as in <>:" ] }, { @@ -74,12 +74,12 @@ "source": [ "Now if we are going to understand how to extract the breed of each pet from each image we're going to need to understand how this data is laid out. Such details of data layout are a vital piece of the deep learning puzzle. 
Data is usually provided in one of these two ways:\n", "\n", - "- Individual files representing items of data, such as text documents or images, possibly organised into folders or with filenames representing information about those items, or\n", - "- A table of data, such as in CSV format, where each row is an item, each row which may include filenames providing a connection between the data in the table and data in other formats such as text documents and images.\n", + "- Individual files representing items of data, such as text documents or images, possibly organized into folders or with filenames representing information about those items\n", + "- A table of data, such as in CSV format, where each row is an item which may include filenames providing a connection between the data in the table and data in other formats, such as text documents and images\n", "\n", - "There are exceptions to these rules, particularly in domains such as genomics, where there can be binary database formats or even network streams, but overall the vast majority of the datasets you work with use some combination of the above two formats.\n", + "There are exceptions to these rules--particularly in domains such as genomics, where there can be binary database formats or even network streams--but overall the vast majority of the datasets you'll work with will use some combination of these two formats.\n", "\n", - "To see what is in our dataset we can use the ls method:" + "To see what is in our dataset we can use the `ls` method:" ] }, { @@ -116,7 +116,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We can see that this dataset provides us with \"images\" and \"annotations\" directories. The website for this dataset tells us that the annotations directory contains information about where the pets are rather than what they are. In this chapter, we will be doing classification, not localization, which is to say that we care about what the pets are not where they are. 
Therefore we will ignore the annotations directory for now. So let's have a look inside the images directory:" + "We can see that this dataset provides us with *images* and *annotations* directories. The [website](https://www.robots.ox.ac.uk/~vgg/data/pets/) for the dataset tells us that the *annotations* directory contains information about where the pets are rather than what they are. In this chapter, we will be doing classification, not localization, which is to say that we care about what the pets are, not where they are. Therefore, we will ignore the *annotations* directory for now. So, let's have a look inside the *images* directory:" ] }, { @@ -143,9 +143,9 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Most functions and methods in fastai which return a collection use a class called `L`. `L` can be thought of as an enhanced version of the ordinary Python `list` type, with added conveniences for common operations. For instance, when we display an object of this class in a notebook it appears in the format you see above. The first thing that is shown is the number of items in the collection, prefixed with a `#`. You'll also see in the above output that the list is suffixed with a \"…\". This means that only the first few items are displayed — which is a good thing, because we would not want more than 7000 filenames on our screen!\n", + "Most functions and methods in fastai that return a collection use a class called `L`. `L` can be thought of as an enhanced version of the ordinary Python `list` type, with added conveniences for common operations. For instance, when we display an object of this class in a notebook it appears in the format shown there. The first thing that is shown is the number of items in the collection, prefixed with a `#`. You'll also see in the preceding output that the list is suffixed with an ellipsis. 
This means that only the first few items are displayed—which is a good thing, because we would not want more than 7,000 filenames on our screen!\n", "\n", - "By examining these filenames, we see how they appear to be structured. Each file name contains the pet breed, and then an _ character, a number, and finally the file extension. We need to create a piece of code that extracts the breed from a single `Path`. Jupyter notebook makes this easy, because we can gradually build up something that works, and then use it for the entire dataset. We do have to be careful to not make too many assumptions at this point. For instance, if you look carefully you may notice that some of the pet breeds contain multiple words, so we cannot simply break at the first `_` character that we find. To allow us to test our code, let's pick out one of these filenames:" + "By examining these filenames, we can see how they appear to be structured. Each filename contains the pet breed, and then an underscore (`_`), a number, and finally the file extension. We need to create a piece of code that extracts the breed from a single `Path`. Jupyter notebooks make this easy, because we can gradually build up something that works, and then use it for the entire dataset. We do have to be careful to not make too many assumptions at this point. For instance, if you look carefully you may notice that some of the pet breeds contain multiple words, so we cannot simply break at the first `_` character that we find. To allow us to test our code, let's pick out one of these filenames:" ] }, { @@ -163,11 +163,11 @@ "source": [ "The most powerful and flexible way to extract information from strings like this is to use a *regular expression*, also known as a *regex*. 
A regular expression is a special string, written in the regular expression language, which specifies a general rule for deciding if another string passes a test (i.e., \"matches\" the regular expression), and also possibly for plucking a particular part or parts out of that other string. \n",
    "\n",
-    "In this case, we need a regular expression that extracts the pet breed from the file name.\n",
+    "In this case, we need a regular expression that extracts the pet breed from the filename.\n",
    "\n",
-    "We do not have the space to give you a complete regular expression tutorial here, particularly because there are so many excellent ones online. And we know that many of you will already be familiar with this wonderful tool. If you're not, that is totally fine — this is a great opportunity for you to rectify that! We find that regular expressions are one of the most useful tools in our programming toolkit, and many of our students tell us that it is one of the things they are most excited to learn about. So head over to Google and search for *regular expressions tutorial* now, and then come back here after you've had a good look around. The book website also provides a list of our favorites.\n",
+    "We do not have the space to give you a complete regular expression tutorial here, but there are many excellent ones online and we know that many of you will already be familiar with this wonderful tool. If you're not, that is totally fine—this is a great opportunity for you to rectify that! We find that regular expressions are one of the most useful tools in our programming toolkit, and many of our students tell us that this is one of the things they are most excited to learn about. So head over to Google and search for \"regular expressions tutorial\" now, and then come back here after you've had a good look around. 
The [book's website](https://book.fast.ai/) also provides a list of our favorites.\n", "\n", - "> a: Not only are regular expressions dead handy, they also have interesting roots. They are \"regular\" because they were originally examples of a \"regular\" language, the lowest rung within the \"Chomsky hierarchy\", a grammar classification due to the same linguist Noam Chomsky who wrote _Syntactic Structures_, the pioneering work searching for the formal grammar underlying human language. This is one of the charms of computing: it may be that the hammer you reach for every day in fact came from a space ship.\n", + "> a: Not only are regular expressions dead handy, but they also have interesting roots. They are \"regular\" because they were originally examples of a \"regular\" language, the lowest rung within the Chomsky hierarchy, a grammar classification developed by linguist Noam Chomsky, who also wrote _Syntactic Structures_, the pioneering work searching for the formal grammar underlying human language. This is one of the charms of computing: it may be that the hammer you reach for every day in fact came from a spaceship.\n", "\n", "When you are writing a regular expression, the best way to start is just to try it against one example at first. Let's use the `findall` method to try a regular expression against the filename of the `fname` object:" ] @@ -196,9 +196,9 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "This regular expression plucks out all the characters leading up to the last underscore character, as long as the subsequence characters are numerical digits and then the jpeg file extension.\n", + "This regular expression plucks out all the characters leading up to the last underscore character, as long as the subsequence characters are numerical digits and then the JPEG file extension.\n", "\n", - "Now that we confirmed the regular expression works for the example, let's use it to label the whole dataset. 
Fastai comes with many classes to help you with your labelling. For labelling with regular expressions, we can use the `RegexLabeller` class. We can use this in the data block API that we saw in <> (in fact, we nearly always use the data block API--it's so much more flexible than the simple factory methods we saw in <>):" + "Now that we confirmed the regular expression works for the example, let's use it to label the whole dataset. Fastai comes with many classes to help with labeling. For labeling with regular expressions, we can use the `RegexLabeller` class. In this example we use the data block API we saw in <> (in fact, we nearly always use the data block API--it's so much more flexible than the simple factory methods we saw in <>):" ] }, { @@ -227,7 +227,7 @@ "batch_tfms=aug_transforms(size=224, min_scale=0.75)\n", "```\n", "\n", - "These lines implement a fastai data augmentation strategy which we call *presizing*. Presizing is a particular way to do image augmentation, which is designed to minimize data destruction while maintaining good performance." + "These lines implement a fastai data augmentation strategy which we call *presizing*. Presizing is a particular way to do image augmentation that is designed to minimize data destruction while maintaining good performance." ] }, { @@ -241,14 +241,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We need our images to have the same dimensions, so that they can collate into tensors to be passed to the GPU. We also want to minimize the number of distinct augmentation computations we perform. So the performance requirement suggests that we should, where possible, compose our augmentation transforms into fewer transforms (to reduce the number of computations, and reduce the number of lossy operations) and transform the images into uniform sizes (to run compute efficiently on the GPU).\n", + "We need our images to have the same dimensions, so that they can collate into tensors to be passed to the GPU. 
We also want to minimize the number of distinct augmentation computations we perform. The performance requirement suggests that we should, where possible, compose our augmentation transforms into fewer transforms (to reduce the number of computations and the number of lossy operations) and transform the images into uniform sizes (for more efficient processing on the GPU).\n",
    "\n",
-    "The challenge is that, if performed after resizing down to the augmented size, various common data augmentation transforms might introduce spurious empty zones, degrade data, or both. For instance, rotating an image by 45 degrees fills corner regions of the new bounds with emptyness, which will not teach the model anything. Many rotation and zooming operations will require interpolating to create pixels. These interpolated pixels are derived from the original image data but are still of lower quality.\n",
+    "The challenge is that, if performed after resizing down to the augmented size, various common data augmentation transforms might introduce spurious empty zones, degrade data, or both. For instance, rotating an image by 45 degrees fills corner regions of the new bounds with emptiness, which will not teach the model anything. Many rotation and zooming operations will require interpolating to create pixels. These interpolated pixels are derived from the original image data but are still of lower quality.\n",
    "\n",
-    "To workaround these challenges, presizing adopts two strategies that are shown in <>:\n",
+    "To work around these challenges, presizing adopts two strategies that are shown in <>:\n",
    "\n",
-    "1. First, resizing images to relatively \"large dimensions\" that is, dimensions significantly larger than the target training dimensions. \n",
-    "1. Second, composing all of the common augmentation operations (including a resize to the final target size) into one, and performing the combined operation on the GPU only once at the end of processing, rather than performing them individually and interpolating multiple times.\n",
+    "1. Resize images to relatively \"large\" dimensions--that is, dimensions significantly larger than the target training dimensions. \n",
+    "1. 
Compose all of the common augmentation operations (including a resize to the final target size) into one, and perform the combined operation on the GPU only once at the end of processing, rather than performing the operations individually and interpolating multiple times.\n", "\n", "The first step, the resize, creates images large enough that they have spare margin to allow further augmentation transforms on their inner regions without creating empty zones. This transformation works by resizing to a square, using a large crop size. On the training set, the crop area is chosen randomly, and the size of the crop is selected to cover the entire width or height of the image, whichever is smaller.\n", "\n", @@ -268,19 +268,19 @@ "source": [ "This picture shows the two steps:\n", "\n", - "1. *Crop full width or height*: This is in `item_tfms`, so it's applied to each individual image before it is copied to the GPU. It's used to ensure all images are the same size. On the training set, the crop area is chosen randomly. On the validation set, the center square of the image is always chosen\n", - "2. *Random crop and augment*: This is in `batch_tfms`, so it's applied to a batch all at once on the GPU, which means it's fast. On the validation set, only the resize to the final size needed for the model is done here. On the training set, the random crop and any other augmentation is done first.\n", + "1. *Crop full width or height*: This is in `item_tfms`, so it's applied to each individual image before it is copied to the GPU. It's used to ensure all images are the same size. On the training set, the crop area is chosen randomly. On the validation set, the center square of the image is always chosen.\n", + "2. *Random crop and augment*: This is in `batch_tfms`, so it's applied to a batch all at once on the GPU, which means it's fast. On the validation set, only the resize to the final size needed for the model is done here. 
On the training set, the random crop and any other augmentations are done first.\n",
    "\n",
-    "To implement this process in fastai you use `Resize` as an item transform with a large size, and `RandomResizedCrop` as a batch transform with a smaller size. `RandomResizedCrop` will be added for you if you include the `min_scale` parameter in your `aug_transform` function, as you see in the `DataBlock` call above. Alternatively, you can use `pad` or `squish` instead of `crop` (the default) for the initial `Resize`.\n",
+    "To implement this process in fastai you use `Resize` as an item transform with a large size, and `RandomResizedCrop` as a batch transform with a smaller size. `RandomResizedCrop` will be added for you if you include the `min_scale` parameter in your `aug_transforms` function, as was done in the `DataBlock` call in the previous section. Alternatively, you can use `pad` or `squish` instead of `crop` (the default) for the initial `Resize`.\n",
    "\n",
-    "You can see in this example the difference between an image which has been zoomed, interpolated, rotated, and then interpolated again on the right (which is the approach used by all other deep learning libraries), compared to an image which has been zoomed and rotated as one operation, and then interpolated just once on the left (the fastai approach):"
+    "<> shows the difference between an image that has been zoomed, interpolated, rotated, and then interpolated again (which is the approach used by all other deep learning libraries), shown here on the right, and an image that has been zoomed and rotated as one operation and then interpolated just once (the fastai approach), shown here on the left."
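The "compose the operations, interpolate once" idea behind presizing can be illustrated outside of fastai with plain 2×2 matrices. This is a simplified sketch (real image augmentation composes full affine warps and then resamples the pixels once); the function names here are our own, not fastai's:

```python
import math

def rotation(theta):
    # 2x2 linear transform for a rotation by theta radians
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

def zoom(k):
    # 2x2 linear transform for a uniform zoom by factor k
    return [[k, 0.0], [0.0, k]]

def compose(a, b):
    # matrix product a @ b: apply b first, then a
    return [[sum(a[i][t] * b[t][j] for t in range(2)) for j in range(2)]
            for i in range(2)]

# Composing the rotate and the zoom into one combined matrix means the image
# only needs to be resampled (interpolated) once, instead of once per operation.
combined = compose(zoom(1.2), rotation(math.pi / 4))
```

Applying `combined` to the image coordinates gives the same geometry as rotating and then zooming, but with a single lossy interpolation step rather than two.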
] }, { "cell_type": "code", "execution_count": null, "metadata": { - "hide_input": true + "hide_input": false }, "outputs": [ { @@ -298,6 +298,8 @@ ], "source": [ "#hide_input\n", + "#id interpolations\n", + "#caption A comparison of fastai's data augmentation strategy (left) and the traditional approach (right).\n", "dblock1 = DataBlock(blocks=(ImageBlock(), CategoryBlock()),\n", " get_y=parent_label,\n", " item_tfms=Resize(460))\n", @@ -324,23 +326,23 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "You can see here that the image on the right is less well defined, and has reflection padding artifacts in the bottom left, and the grass in the top left has disappeared entirely. We find that in practice using presizing significantly improves the accuracy of models, and often results in speedups too.\n", + "You can see that the image on the right is less well defined and has reflection padding artifacts in the bottom-left corner; also, the grass iat the top left has disappeared entirely. We find that in practice using presizing significantly improves the accuracy of models, and often results in speedups too.\n", "\n", - "Checking your data looks right is extremely important before training a model. There are simple ways to do this (and debug if needed) in the fastai library, let's look at them now." + "The fastai library also provides simple ways to check your data looks right before training a model, which is an extremely important step. We'll look at those next." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Checking and debugging a DataBlock" + "### Checking and Debugging a DataBlock" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "We can never just assume that our code is working perfectly. Writing a `DataBlock` is just like writing a blueprint. 
You will get an error message if you have a syntax error somewhere in your code but you have no guaranty that your template is going to work on your source of data as you intend. The first thing to do before we trying to train a model is to use the `show_batch` method and have a look at your data:" + "We can never just assume that our code is working perfectly. Writing a `DataBlock` is just like writing a blueprint. You will get an error message if you have a syntax error somewhere in your code, but you have no guarantee that your template is going to work on your data source as you intend. So, before training a model you should always check your data. You can do this using the `show_batch` method:" ] }, { @@ -369,9 +371,9 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Have a look at each image, and check that each one seems to have the correct label for that breed of pet. Often, data scientists work with data with which they are not as familiar as domain experts may be: for instance, I actually don't know what a lot of these pet breeds are. Since I am not an expert on pet breeds, I would use Google images at this point to search for a few of these breeds, and make sure the images looks similar to what I see in this output.\n", + "Take a look at each image, and check that each one seems to have the correct label for that breed of pet. Often, data scientists work with data with which they are not as familiar as domain experts may be: for instance, I actually don't know what a lot of these pet breeds are. Since I am not an expert on pet breeds, I would use Google images at this point to search for a few of these breeds, and make sure the images look similar to what I see in this output.\n", "\n", - "If you made a mistake while building your `DataBlock` it is very likely you won't see it before this step. To debug this, we encourage you to use the `summary` method. It will attempt to create a batch from the source you give it, with a lot of details. 
Also, if it fails, you will see exactly at which point the error happens, and the library will try to give you some help. For instance, one common mistake is to forget to put a `Resize` transform, ending up with pictures of different sizes and not able to batch them. Here is what the summary would look like in that case (note that the exact text may have changed since the time of writing, but it will give you an idea):"
+    "If you made a mistake while building your `DataBlock`, it is very likely you won't see it before this step. To debug this, we encourage you to use the `summary` method. It will attempt to create a batch from the source you give it, with a lot of details. Also, if it fails, you will see exactly at which point the error happens, and the library will try to give you some help. For instance, one common mistake is to forget to use a `Resize` transform, so you end up with pictures of different sizes and are not able to batch them. Here is what the summary would look like in that case (note that the exact text may have changed since the time of writing, but it will give you an idea):"
   ]
  },
  {
@@ -514,9 +516,9 @@
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-    "You can see exactly how we gathered the data and split it, how we went from a filename to a *sample* (the tuple image, category), then what item transforms were applied and how it failed to collate those samples in a batch (because of the different shapes). \n",
+    "You can see exactly how we gathered the data and split it, how we went from a filename to a *sample* (the tuple (image, category)), then what item transforms were applied and how it failed to collate those samples in a batch (because of the different shapes). \n",
    "\n",
-    "Once you think your data looks right, we generally recommend the next step should be creating a simple model. We often see people procrastinate the training of an actual model for far too long. 
As a result, they don't actually get to find out what their baseline results look like. Perhaps it doesn't need lots of fancy domain specific engineering. Or perhaps the data doesn't seem to train it all. These are things that you want to know as soon as possible. So we will use the same simple model that we used in <>:"
+    "Once you think your data looks right, we generally recommend the next step should be to train a simple model. We often see people put off the training of an actual model for far too long. As a result, they don't actually find out what their baseline results look like. Perhaps your problem doesn't need lots of fancy domain-specific engineering. Or perhaps the data doesn't seem to train the model at all. These are things that you want to know as soon as possible. For this initial test, we'll use the same simple model that we used in <>:"
   ]
  },
  {
@@ -603,42 +605,42 @@
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-    "As we've briefly discussed before, the table shown when we fit a model shows us the results after each epoch of training. Remember, an epoch is one complete pass through all of the images in the data. The columns shown are the average loss over the items of the training set, the loss on the validation set, and any metrics that you requested — in this case, the error rate.\n",
+    "As we've briefly discussed before, the table shown when we fit a model shows us the results after each epoch of training. Remember, an epoch is one complete pass through all of the images in the data. The columns shown are the average loss over the items of the training set, the loss on the validation set, and any metrics that we requested—in this case, the error rate.\n",
    "\n",
-    "Remember that *loss* is whatever function we've decided to use to optimise the parameters of our model. But we haven't actually told fastai what loss function we want to use. So what is it doing? 
Fastai will generally try to select an appropriate loss function based on what kind of data and model you are using. In this case you have image data, and a categorical outcome, so fastai will default to using *cross entropy loss*." + "Remember that *loss* is whatever function we've decided to use to optimize the parameters of our model. But we haven't actually told fastai what loss function we want to use. So what is it doing? Fastai will generally try to select an appropriate loss function based on what kind of data and model you are using. In this case we have image data and a categorical outcome, so fastai will default to using *cross-entropy loss*." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Cross entropy loss" + "## Cross-Entropy Loss" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "*Cross entropy loss* is a loss function which is similar to the loss function we used in the previous chapter, but (as we'll see) has two benefits:\n", + "*Cross-entropy loss* is a loss function that is similar to the one we used in the previous chapter, but (as we'll see) has two benefits:\n", "\n", - "- It works even when our dependent variable has more than two categories\n", + "- It works even when our dependent variable has more than two categories.\n", "- It results in faster and more reliable training.\n", "\n", - "In order to understand how cross entropy loss works for dependent variables with more than two categories, we first have to understand what the actual data and activations that are seen by the loss function look like." + "In order to understand how cross-entropy loss works for dependent variables with more than two categories, we first have to understand what the actual data and activations that are seen by the loss function look like." 
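The core idea of cross-entropy loss can be sketched in a few lines of plain Python. This is a simplified illustration, not the fastai or PyTorch implementation (the chapter builds up the real version step by step): the loss is just the negative log of the probability the model assigned to the correct class.

```python
import math

def cross_entropy(probs, target):
    # negative log of the (post-softmax) probability given to the correct class;
    # `probs` is a list of class probabilities, `target` is the correct class index
    return -math.log(probs[target])

# A confident, correct prediction gives a small loss...
loss_good = cross_entropy([0.9, 0.05, 0.05], target=0)
# ...while the same confident prediction scored against the wrong class
# gives a much larger loss.
loss_bad = cross_entropy([0.9, 0.05, 0.05], target=1)
```

Because the log blows up as the correct-class probability approaches zero, confident mistakes are penalized heavily, which is part of why this loss trains reliably.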
] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Viewing activations and labels" + "### Viewing Activations and Labels" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Let's have a look at the activations of our model. To actually get a batch of real data from our DataLoaders, we can use the `one_batch` method:" + "Let's take a look at the activations of our model. To actually get a batch of real data from our `DataLoaders`, we can use the `one_batch` method:" ] }, { @@ -654,7 +656,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "As you see, this returns the dependent, and the independent variables, as a mini-batch. Let's see what is actually contained in our dependent variable:" + "As you see, this returns the dependent and independent variables, as a mini-batch. Let's see what is actually contained in our dependent variable:" ] }, { @@ -682,7 +684,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Our batch size is 64, so we have 64 rows in this tensor. Each row is a single integer between zero and 36, representing our 37 possible pet breeds. We can view the predictions (that is, the activations of the final layer of our neural network) using `Learner.get_preds`. This function either takes a dataset index (0 for train and 1 for valid) or an iterator of batches. Thus, we can pass it a simple list with our batch to get our predictions. It returns predictions and targets by default, but since we already have the targets, we can effectively ignore them by assigning to the special variable `_`:" + "Our batch size is 64, so we have 64 rows in this tensor. Each row is a single integer between 0 and 36, representing our 37 possible pet breeds. We can view the predictions (that is, the activations of the final layer of our neural network) using `Learner.get_preds`. This function either takes a dataset index (0 for train and 1 for valid) or an iterator of batches. 
Thus, we can pass it a simple list with our batch to get our predictions. It returns predictions and targets by default, but since we already have the targets, we can effectively ignore them by assigning to the special variable `_`:" ] }, { @@ -722,7 +724,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The actual predictions are 37 probabilities between zero and one, which add up to 1 in total." + "The actual predictions are 37 probabilities between 0 and 1, which add up to 1 in total:" ] }, { @@ -749,7 +751,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "To transform the activations of our model into predictions like this, we used something called the softmax activation function." + "To transform the activations of our model into predictions like this, we used something called the *softmax* activation function." ] }, { @@ -763,9 +765,9 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "In our classification model, an activation function called *softmax* in the final layer is used to ensure that the activations are between zero and one, and that they sum to one.\n", + "In our classification model, we use the softmax activation function in the final layer to ensure that the activations are all between 0 and 1, and that they sum to 1.\n", "\n", - "Softmax is similar to the sigmoid function, which we saw earlier; sigmoid looks like this:" + "Softmax is similar to the sigmoid function, which we saw earlier. As a reminder sigmoid looks like this:" ] }, { @@ -794,9 +796,9 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We can apply this function to a single column of activations from a neural network, and get back a column of numbers between zero and one. 
So it's a very useful activation function for our final layer.\n", + "We can apply this function to a single column of activations from a neural network, and get back a column of numbers between 0 and 1, so it's a very useful activation function for our final layer.\n", "\n", - "Now think about what happens if we want to have more categories in our target (such as our 37 pet breeds). That means we'll need more activations than just a single column: we need an activation *per category*. We can create, for instance, a neural net that predicts \"3\"s and \"7\"s that returns two activations, one for each class--this will be a good first step towards creating the more general approach. Let's just use some random numbers with a standard deviation of 2 (so we multiply `randn` by 2) for this example, assuming we have six images and two possible categories (where the first columns represents \"3\"s and the second is \"7\"s):" + "Now think about what happens if we want to have more categories in our target (such as our 37 pet breeds). That means we'll need more activations than just a single column: we need an activation *per category*. We can create, for instance, a neural net that predicts 3s and 7s that returns two activations, one for each class--this will be a good first step toward creating the more general approach. 
Let's just use some random numbers with a standard deviation of 2 (so we multiply `randn` by 2) for this example, assuming we have 6 images and 2 possible categories (where the first column represents 3s and the second is 7s):" ] }, { @@ -839,7 +841,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We can't just take the sigmoid of this directly, since we don't get rows that add to one (i.e we want the probability of being a \"3\" plus the probability of being a \"7\" to add to one):" + "We can't just take the sigmoid of this directly, since we don't get rows that add to 1 (i.e., we want the probability of being a 3 plus the probability of being a 7 to add up to 1):" ] }, { @@ -871,11 +873,11 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "In <>, the neural net created a single activation per image, which we passed through the sigmoid function. That single activation represented the confidence that the input was a \"3\". Binary problems are a special case of classification problems, because the target can be treated as a single boolean value, as we did in `mnist_loss`. Binary problems can also be thought of as part of the more general group of classifiers with any number of categories--where in this case we happen to have 2 categories. As we saw in the bear classifier, our neural net will return one activation per category.\n", + "In <>, our neural net created a single activation per image, which we passed through the `sigmoid` function. That single activation represented the model's confidence that the input was a 3. Binary problems are a special case of classification problems, because the target can be treated as a single boolean value, as we did in `mnist_loss`. But binary problems can also be thought of in the context of the more general group of classifiers with any number of categories: in this case, we happen to have two categories. 
As we saw in the bear classifier, our neural net will return one activation per category.\n", "\n", - "So in the binary case, what do those activations really indicate? A single pair of activations simply indicates the *relative* confidence of being a \"3\" versus being a \"7\". The overall values, whether they are both high, or both low, don't matter--all that matters is which is higher, and by how much.\n", + "So in the binary case, what do those activations really indicate? A single pair of activations simply indicates the *relative* confidence of the input being a 3 versus being a 7. The overall values, whether they are both high, or both low, don't matter--all that matters is which is higher, and by how much.\n", "\n", - "We would expect that since this is just another way of representing the same problem (in the binary case) that we would be able to use sigmoid directly on the two-activation version of our neural net. And indeed we can! We can just take the *difference* between the neural net activations, because that reflects how much more sure we are of being a \"3\" vs a \"7\", and then take the sigmoid of that:" + "We would expect that since this is just another way of representing the same problem, that we would be able to use `sigmoid` directly on the two-activation version of our neural net. And indeed we can! We can just take the *difference* between the neural net activations, because that reflects how much more sure we are of the input being a 3 than a 7, and then take the sigmoid of that:" ] }, { @@ -902,7 +904,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The second column (the probability of being a \"7\") will then just be that subtracted from one. We need a way to do all this that also works for more than two columns. It turns out that this function, called `softmax`, is exactly that:\n", + "The second column (the probability of it being a 7) will then just be that value subtracted from 1. 
Now, we need a way to do all this that also works for more than two columns. It turns out that this function, called `softmax`, is exactly that:\n", "\n", "``` python\n", "def softmax(x): return exp(x) / exp(x).sum(dim=1, keepdim=True)\n", @@ -913,14 +915,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "> jargon: Exponential function (exp): Literally defined as `e**x`, where `e` is a special number approximately equal to 2.718. It is the inverse of the natural logarithm function. Note that `exp` is always positive, and it increases *very* rapidly!" + "> jargon: Exponential function (exp): Literally defined as `e**x`, where `e` is a special number approximately equal to 2.718. It is the inverse of the natural logarithm function. Note that `exp` is always positive, and it increases _very_ rapidly!" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Let's check that `softmax` returns the same values as `sigmoid` for the first column, and that subtracted from one for the second column:" + "Let's check that `softmax` returns the same values as `sigmoid` for the first column, and those values subtracted from 1 for the second column:" ] }, { @@ -953,41 +955,41 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Softmax is the multi-category equivalent of sigmoid--we have to use it any time we have more than two categories, and the probabilities of the categories must add to one. (We often use it even when there's just two categories, just to make things a bit more consistent.) We could create other functions that have the properties that all activations are between zero and one, and sum to one; however, no other function has the same relationship to the sigmoid function, which we've seen is smooth and symmetric. 
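The two claims in this passage--that the rows of a softmax sum to 1, and that for two categories its first column equals the sigmoid of the difference of the activations--can be sanity-checked with a short sketch in plain PyTorch (this snippet is an illustration we add here, using random activations, not a cell from the original notebook):

```python
import torch

def softmax(x): return torch.exp(x) / torch.exp(x).sum(dim=1, keepdim=True)

torch.manual_seed(42)
# Hypothetical activations: 6 images, 2 classes (column 0 for "3", column 1 for "7")
acts = torch.randn(6, 2) * 2
sm = softmax(acts)

# Each row of the softmax output sums to 1...
assert torch.allclose(sm.sum(dim=1), torch.ones(6))
# ...and its first column is exactly sigmoid of the difference of the activations,
# since e^a/(e^a+e^b) = 1/(1+e^(b-a)) = sigmoid(a-b)
assert torch.allclose(sm[:, 0], torch.sigmoid(acts[:, 0] - acts[:, 1]))
```

This is why the two-activation version of the binary problem behaves exactly like the single-activation sigmoid version.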
Also, we'll see shortly that the softmax function works well hand-in-hand with the loss function we will look at in the next section.\n", +    "`softmax` is the multi-category equivalent of `sigmoid`--we have to use it any time we have more than two categories and the probabilities of the categories must add to 1, and we often use it even when there are just two categories, just to make things a bit more consistent. We could create other functions that have the properties that all activations are between 0 and 1, and sum to 1; however, no other function has the same relationship to the sigmoid function, which we've seen is smooth and symmetric. Also, we'll see shortly that the softmax function works well hand-in-hand with the loss function we will look at in the next section.\n", "\n", -    "If we have three output activations, such as in our bear classifier, calculating softmax for a single bear image would then look something like this:" +    "If we have three output activations, such as in our bear classifier, calculating softmax for a single bear image would then look something like <>." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ -    "\"Bear" +    "\"Bear" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ -    "What does this function do in practice? Taking the exponential ensures all our numbers are positive, and then dividing by the sum ensures we are going to have a bunch of numbers that add up to one. The exponential also has a nice property: if one of the numbers in our activations `x` is slightly bigger than the others, the exponential will amplify this (since it grows, well... exponentially) which means that in the softmax, that number will be closer to 1. \n", +    "What does this function do in practice? Taking the exponential ensures all our numbers are positive, and then dividing by the sum ensures we are going to have a bunch of numbers that add up to 1. 
The exponential also has a nice property: if one of the numbers in our activations `x` is slightly bigger than the others, the exponential will amplify this (since it grows, well... exponentially), which means that in the softmax, that number will be closer to 1. \n", "\n", -    "Intuitively, the softmax function *really* wants to pick one class among the others, so it's ideal for training a classifier when we know each picture has a definite label. (Note that it may be less ideal during inference, as you might want your model to sometimes tell you it doesn't recognize any of the classes that it has seen during training, and not pick a class because it has a slightly bigger activation score. In this case, it might be better to train a model using multiple binary output columns, each using a sigmoid activation.)\n", +    "Intuitively, the softmax function *really* wants to pick one class among the others, so it's ideal for training a classifier when we know each picture has a definite label. (Note that it may be less ideal during inference, as you might want your model to sometimes tell you it doesn't recognize any of the classes that it has seen during training, and not pick a class because it has a slightly bigger activation score. In this case, it might be better to train a model using multiple binary output columns, each using a sigmoid activation.)\n", "\n", -    "Softmax is the first part of the cross entropy loss, the second part is log likelihood. " +    "Softmax is the first part of the cross-entropy loss--the second part is log likelihood. 
" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Log likelihood" + "### Log Likelihood" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "When we calculated the loss for our MNIST example in the last chapter we used.\n", + "When we calculated the loss for our MNIST example in the last chapter we used:\n", "\n", "```python\n", "def mnist_loss(inputs, targets):\n", @@ -995,9 +997,9 @@ " return torch.where(targets==1, 1-inputs, inputs).mean()\n", "```\n", "\n", - "Just like we moved from sigmoid to softmax, we need to extend the loss function to work with more than just binary classification, to classifying any number of categories (in this case, we have 37 categories). Our activations, after softmax, are between zero and one, and sum to one for each row in the batch of predictions. Our targets are integers between 0 and 36.\n", + "Just as we moved from sigmoid to softmax, we need to extend the loss function to work with more than just binary classification--it needs to be able to classify any number of categories (in this case, we have 37 categories). Our activations, after softmax, are between 0 and 1, and sum to 1 for each row in the batch of predictions. Our targets are integers between 0 and 36.\n", "\n", - "In the binary case, we used `torch.where` to select between `inputs` and `1-inputs`. When we treat a binary classification as a general classification problem with two categories, it actually becomes even easier, because (as we saw in the softmax section) we now have two columns, containing the equivalent of `inputs` and `1-inputs`. So all we need to do is select from the appropriate column. Let's try to implement this in PyTorch. For our synthetic \"3\"s and \"7\" example, let's say these are our labels:" + "In the binary case, we used `torch.where` to select between `inputs` and `1-inputs`. 
When we treat a binary classification as a general classification problem with two categories, it actually becomes even easier, because (as we saw in the previous section) we now have two columns, containing the equivalent of `inputs` and `1-inputs`. So, all we need to do is select from the appropriate column. Let's try to implement this in PyTorch. For our synthetic 3s and 7s example, let's say these are our labels:" ] }, { @@ -1013,7 +1015,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "...and these are the softmax activations:" + "and these are the softmax activations:" ] }, { @@ -1045,7 +1047,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Then for each item of `targ` we can use that to select that column of `sm_acts` using tensor indexing, like so:" + "Then for each item of `targ` we can use that to select the appropriate column of `sm_acts` using tensor indexing, like so:" ] }, { @@ -1155,11 +1157,11 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Looking at this table, you can see that the final column can be calculated by taking the `targ` and `idx` columns as indices into the 2-column matrix containing the `3` and `7` columns. That's what `sm_acts[idx, targ]` is actually doing.\n", + "Looking at this table, you can see that the final column can be calculated by taking the `targ` and `idx` columns as indices into the two-column matrix containing the `3` and `7` columns. That's what `sm_acts[idx, targ]` is actually doing.\n", "\n", - "The really interesting thing here is that this actually works just as well with more than two columns. To see this, consider what would happen if we added an activation column above for every digit (zero through nine), and then `targ` contained a number from zero to nine. 
As long as the activation columns sum to one (as they will, if we use softmax), then we'll have a loss function that shows how well we're predicting each digit.\n", + "The really interesting thing here is that this actually works just as well with more than two columns. To see this, consider what would happen if we added an activation column for every digit (0 through 9), and then `targ` contained a number from 0 to 9. As long as the activation columns sum to 1 (as they will, if we use softmax), then we'll have a loss function that shows how well we're predicting each digit.\n", "\n", - "We're only picking the loss from the column containing the correct label. We don't need to consider the other columns, because by the definition of softmax, they add up to one minus the activation corresponding to the correct label. Therefore, making the activation for the correct label as high as possible, must mean we're also decreasing the activations of the remaining columns.\n", + "We're only picking the loss from the column containing the correct label. We don't need to consider the other columns, because by the definition of softmax, they add up to 1 minus the activation corresponding to the correct label. Therefore, making the activation for the correct label as high as possible must mean we're also decreasing the activations of the remaining columns.\n", "\n", "PyTorch provides a function that does exactly the same thing as `sm_acts[range(n), targ]` (except it takes the negative, because when applying the log afterward, we will have negative numbers), called `nll_loss` (*NLL* stands for *negative log likelihood*):" ] @@ -1208,21 +1210,21 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Despite the name being negative log likelihood, this PyTorch function does not take the log (we will see why in the next section). First, let's see why taking the logarithm can be useful." + "Despite its name, this PyTorch function does not take the log. 
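The claim that `F.nll_loss` does the same thing as the indexing trick (with a sign flip, and without taking any log itself) is easy to verify. The activations below are randomly generated stand-ins, not the values from the book's table:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(42)
# Hypothetical softmax outputs for 6 images and 2 classes (each row sums to 1)
sm_acts = torch.softmax(torch.randn(6, 2) * 2, dim=1)
targ = torch.tensor([0, 1, 0, 1, 1, 0])   # correct class index for each image

idx = range(6)
picked = sm_acts[idx, targ]               # probability assigned to the correct class, per row

# F.nll_loss with reduction='none' returns exactly these values, negated;
# note that it takes no log itself -- it assumes that has already happened
nll = F.nll_loss(sm_acts, targ, reduction='none')
assert torch.allclose(nll, -picked)
```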
We'll see why in the next section, but first, let's see why taking the logarithm can be useful." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Taking the `log`" + "### Taking the Log" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "This does work quite well as a loss function, but we can make it a bit better. The problem is that we are using probabilities, and probabilities cannot be smaller than zero, or greater than one. But that means that our model will not care about whether it predicts 0.99 versus 0.999, because those numbers are so close together. But in another sense, 0.999 is 10 times more confident than 0.99. So we wish to transform our numbers between zero and one to instead be between negative infinity and infinity. There is a function available in maths which does exactly this: the logarithm (available as `torch.log`). It is not defined for numbers less than zero, and looks like this:" + "The function we saw in the previous section works quite well as a loss function, but we can make it a bit better. The problem is that we are using probabilities, and probabilities cannot be smaller than 0 or greater than 1. That means that our model will not care whether it predicts 0.99 or 0.999. Indeed, those numbers are so close together--but in another sense, 0.999 is 10 times more confident than 0.99. So, we want to transform our numbers between 0 and 1 to instead be between negative infinity and infinity. There is a mathematical function that does exactly this: the *logarithm* (available as `torch.log`). It is not defined for numbers less than 0, and looks like this:" ] }, { @@ -1260,11 +1262,11 @@ "\n", "In this case, we're assuming that `log(y,b)` returns *log y base b*. However, PyTorch actually doesn't define `log` this way: `log` in Python uses the special number `e` (2.718...) as the base.\n", "\n", - "Perhaps a logarithm is something that you have not thought about for the last 20 years or so. 
But it's a mathematical idea which is going to be really critical for many things in deep learning, so now would be a great time to refresh your memory. The key thing to know about logarithms is this relationship:\n", +    "Perhaps a logarithm is something that you have not thought about for the last 20 years or so. But it's a mathematical idea that is going to be really critical for many things in deep learning, so now would be a great time to refresh your memory. The key thing to know about logarithms is this relationship:\n", "\n", "    log(a*b) = log(a)+log(b)\n", "\n", -    "When we see it in that format, it looks a bit boring; but have a think about what this really means. It means that logarithms increase linearly when the underlying signal increases exponentially or multiplicatively. This is used for instance in the Richter scale of earthquake severity, and the dB scale of noise levels. It's also often used on financial charts, where we want to show compound growth rates more clearly. Computer scientists love using logarithms, because it means that multiplication, which can create really really large and really really small numbers, can be replaced by addition, which is much less likely to result in scales which are difficult for our computer to handle." +    "When we see it in that format, it looks a bit boring; but think about what this really means. It means that logarithms increase linearly when the underlying signal increases exponentially or multiplicatively. This is used, for instance, in the Richter scale of earthquake severity, and the dB scale of noise levels. It's also often used on financial charts, where we want to show compound growth rates more clearly. Computer scientists love using logarithms, because it means that multiplication, which can create really really large and really really small numbers, can be replaced by addition, which is much less likely to result in scales that are difficult for our computers to handle."
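A quick check of this identity, and of why it matters so much for computers, can be done with nothing but the standard library (this example is ours, not a cell from the notebook):

```python
import math

# The key identity: multiplication in the original scale becomes addition in log scale
a, b = 3.0, 7.0
assert math.isclose(math.log(a * b), math.log(a) + math.log(b))

# Why this matters in practice: a product of many small probabilities
# underflows to exactly 0.0 in floating point...
probs = [0.01] * 200
assert math.prod(probs) == 0.0
# ...but the equivalent sum of logs stays a perfectly ordinary number
log_prob = sum(math.log(p) for p in probs)
assert math.isclose(log_prob, 200 * math.log(0.01))
```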
] }, { "cell_type": "markdown", "metadata": {}, "source": [ -    "> warning: The \"NLL\" in \"nll_loss\" stands for \"negative log likelihood\", but it doesn't actually take the log at all! It assumes you have _already_ taken the log. PyTorch has a function called \"log_softmax\" which combines \"log\" and \"softmax\" in a fast and accurate way." +    "> warning: Confusing Name, Beware: The nll in `nll_loss` stands for \"negative log likelihood,\" but it doesn't actually take the log at all! It assumes you have _already_ taken the log. PyTorch has a function called `log_softmax` that combines `log` and `softmax` in a fast and accurate way. `nll_loss` is designed to be used after `log_softmax`." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ -    "When we first take the softmax, and then the log likelihood of that, that combination is called *cross entropy loss*. In PyTorch, this is available as `nn.CrossEntropyLoss` (which, in practice, actually does `log_softmax` and then `nll_loss`)." +    "When we first take the softmax, and then the log likelihood of that, that combination is called *cross-entropy loss*. In PyTorch, this is available as `nn.CrossEntropyLoss` (which, in practice, actually does `log_softmax` and then `nll_loss`):" ] }, { @@ -1335,7 +1337,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ -    "All PyTorch loss functions are provided in two forms: the class form seen above, and also a plain functional form, available in the `F` namespace:" +    "All PyTorch loss functions are provided in two forms, the class just shown above, and also a plain functional form, available in the `F` namespace:" ] }, { @@ -1362,7 +1364,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ -    "Either one works fine and can be used in any situation. 
We've noticed that most people tend to use the class version, and that's more often used in PyTorch official docs and examples, so we'll tend to use that too.\n", + "Either one works fine and can be used in any situation. We've noticed that most people tend to use the class version, and that's more often used in PyTorch's official docs and examples, so we'll tend to use that too.\n", "\n", "By default PyTorch loss functions take the mean of the loss of all items. You can use `reduction='none'` to disable that:" ] @@ -1391,14 +1393,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "> s: An interesting feature about cross entropy loss appears when we consider its gradient. The gradient of `cross_entropy(a,b)` is just `softmax(a)-b`. Since `softmax(a)` is just the final activation of the model, that means that the gradient is proportional to the difference between the prediction and the target. This is the same as mean squared error in regression (assuming there's no final activation function such as that added by `y_range`), since the gradient of `(a-b)**2` is `2*(a-b)`. Since the gradient is linear, that means that we won't see sudden jumps or exponential increases in gradients, which should lead to smoother training of models." + "> s: An interesting feature about cross-entropy loss appears when we consider its gradient. The gradient of `cross_entropy(a,b)` is just `softmax(a)-b`. Since `softmax(a)` is just the final activation of the model, that means that the gradient is proportional to the difference between the prediction and the target. This is the same as mean squared error in regression (assuming there's no final activation function such as that added by `y_range`), since the gradient of `(a-b)**2` is `2*(a-b)`. Because the gradient is linear, that means we won't see sudden jumps or exponential increases in gradients, which should lead to smoother training of models." 
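Both claims here (that `nn.CrossEntropyLoss` is `log_softmax` followed by `nll_loss`, and that its gradient with respect to the activations is `softmax(a)` minus the one-hot-encoded target) can be verified numerically with a small sketch using hypothetical random activations:

```python
import torch
import torch.nn.functional as F
from torch import nn

torch.manual_seed(42)
n, n_classes = 6, 37                       # hypothetical: 6 images, 37 pet breeds
acts = torch.randn(n, n_classes, requires_grad=True)
targ = torch.randint(0, n_classes, (n,))

# nn.CrossEntropyLoss really is log_softmax followed by nll_loss
loss = nn.CrossEntropyLoss()(acts, targ)
assert torch.allclose(loss, F.nll_loss(F.log_softmax(acts, dim=1), targ))

# And its gradient w.r.t. the activations is softmax(acts) minus the
# one-hot-encoded targets (divided by n, because the loss is a mean)
loss.backward()
expected = (torch.softmax(acts.detach(), dim=1) - F.one_hot(targ, n_classes).float()) / n
assert torch.allclose(acts.grad, expected, atol=1e-6)
```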
] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "We have now seen all the pieces hidden behind our loss function. While it gives us a number on how well (or bad) our model is doing, it does nothing to help us know if it's actually any good. Let's now see some ways to interpret our model predictions." + "We have now seen all the pieces hidden behind our loss function. But while this puts a number on how well (or badly) our model is doing, it does nothing to help us know if it's actually any good. Let's now see some ways to interpret our model's predictions." ] }, { @@ -1412,7 +1414,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "It's very hard to interpret loss functions directly, because they are designed to be things which computers can differentiate and optimise, not things that people can understand. That's why we have metrics. These are not used in the optimisation process, but just used to help us poor humans understand what's going on. In this case, our accuracy is looking pretty good already! So where are we making mistakes?\n", + "It's very hard to interpret loss functions directly, because they are designed to be things computers can differentiate and optimize, not things that people can understand. That's why we have metrics. These are not used in the optimization process, but just to help us poor humans understand what's going on. In this case, our accuracy is looking pretty good already! So where are we making mistakes?\n", "\n", "We saw in <> that we can use a confusion matrix to see where our model is doing well, and where it's doing badly:" ] @@ -1455,7 +1457,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Oh dear, in this case, a confusion matrix is very hard to read. We have 37 different breeds of pet, which means we have 37×37 entries in this giant matrix! 
Instead, we can use the `most_confused` method, which just shows us the cells of the confusion matrix with the most incorrect predictions (here with at least 5 or more):" + "Oh dear--in this case, a confusion matrix is very hard to read. We have 37 different breeds of pet, which means we have 37×37 entries in this giant matrix! Instead, we can use the `most_confused` method, which just shows us the cells of the confusion matrix with the most incorrect predictions (here, with at least 5 or more):" ] }, { @@ -1486,16 +1488,16 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Since we are not pet breed experts, it is hard for us to know whether these category errors reflect actual difficulties in recognising breeds. So again, we turn to Google. A little bit of googling tells us that the most common category errors shown here are actually breed differences which even expert breeders sometimes disagree about. So this gives us some comfort that we are on the right track.\n", + "Since we are not pet breed experts, it is hard for us to know whether these category errors reflect actual difficulties in recognizing breeds. So again, we turn to Google. A little bit of Googling tells us that the most common category errors shown here are actually breed differences that even expert breeders sometimes disagree about. So this gives us some comfort that we are on the right track.\n", "\n", - "So we seem to have a good baseline. What can we do now to make it even better?" + "We seem to have a good baseline. What can we do now to make it even better?" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Improving our model" + "## Improving Our Model" ] }, { @@ -1504,21 +1506,21 @@ "source": [ "We will now look at a range of techniques to improve the training of our model and make it better. 
While doing so, we will explain a little bit more about transfer learning and how to fine-tune our pretrained model as best as possible, without breaking the pretrained weights.\n", "\n", - "The first thing we need to set when training a model is the learning rate. We saw in the previous chapter that it needed to be just right to train as efficiently as possible, so how do we pick a good one? fastai provides something called the Learning rate finder for this." + "The first thing we need to set when training a model is the learning rate. We saw in the previous chapter that it needs to be just right to train as efficiently as possible, so how do we pick a good one? fastai provides a tool for this." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Learning rate finder" + "### The Learning Rate Finder" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "One of the most important things we can do when training a model is to make sure that we have the right learning rate. If our learning rate is too low, it can take many many epochs. Not only does this waste time, but it also means that we may have problems with overfitting, because every time we do a complete pass through the data, we give our model a chance to memorise it.\n", + "One of the most important things we can do when training a model is to make sure that we have the right learning rate. If our learning rate is too low, it can take many, many epochs to train our model. Not only does this waste time, but it also means that we may have problems with overfitting, because every time we do a complete pass through the data, we give our model a chance to memorize it.\n", "\n", "So let's just make our learning rate really high, right? Sure, let's try that and see what happens:" ] @@ -1600,14 +1602,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "That did not look good. Here's what happened. 
The optimiser stepped in the correct direction, but it stepped so far that it totally overshot the minimum loss. Repeating that multiple times makes it get further and further away, not closer and closer!\n", + "That doesn't look good. Here's what happened. The optimizer stepped in the correct direction, but it stepped so far that it totally overshot the minimum loss. Repeating that multiple times makes it get further and further away, not closer and closer!\n", "\n", - "What do we do to find the perfect learning rate, not too high, and not too low? In 2015 the researcher Leslie Smith came up with a brilliant idea, called the *learning rate finder*. His idea was to start with a very very small learning rate, something so small that we would never expect it to be too big to handle. We use that for one mini batch, find what the losses are afterwards, and then increase the learning rate by some percentage (e.g. doubling it each time). Then we do another mini batch, track the loss, and double the learning rate again. We keep doing this until the loss gets worse, instead of better. This is the point where we know we have gone too far. We then select a learning rate a bit lower than this point. Our advice is to pick either:\n", + "What do we do to find the perfect learning rate--not too high, and not too low? In 2015 the researcher Leslie Smith came up with a brilliant idea, called the *learning rate finder*. His idea was to start with a very, very small learning rate, something so small that we would never expect it to be too big to handle. We use that for one mini-batch, find what the losses are afterwards, and then increase the learning rate by some percentage (e.g., doubling it each time). Then we do another mini-batch, track the loss, and double the learning rate again. We keep doing this until the loss gets worse, instead of better. This is the point where we know we have gone too far. We then select a learning rate a bit lower than this point. 
Our advice is to pick either:\n", "\n", - "- one order of magnitude less than where the minimum loss was achieved (i.e. the minimum divided by 10)\n", - "- the last point where the loss was clearly decreasing. \n", + "- One order of magnitude less than where the minimum loss was achieved (i.e., the minimum divided by 10)\n", + "- The last point where the loss was clearly decreasing \n", "\n", - "The Learning Rate Finder computes those points on the curve to help you. Both these rules usually give around the same value. In the first chapter, we didn't specify a learning rate, using the default value from the fastai library (which is 1e-3)." + "The learning rate finder computes those points on the curve to help you. Both these rules usually give around the same value. In the first chapter, we didn't specify a learning rate, using the default value from the fastai library (which is 1e-3):" ] }, { @@ -1664,16 +1666,9 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We can see on this plot that in the range 1e-6 to 1e-3, nothing really happens and the model doesn't train. Then the loss starts to decrease until it reaches a minimum and then increases again. We don't want a learning rate greater than 1e-1 as it will give a training that diverges (you can try for yourself) but 1e-1 is already too high: at this stage we left the period where the loss was decreasing steadily.\n", + "We can see on this plot that in the range 1e-6 to 1e-3, nothing really happens and the model doesn't train. Then the loss starts to decrease until it reaches a minimum, and then increases again. We don't want a learning rate greater than 1e-1 as it will give a training that diverges like the one before (you can try for yourself), but 1e-1 is already too high: at this stage we've left the period where the loss was decreasing steadily.\n", "\n", - "In this learning rate plot it appears that a learning rate around 3e-3 would be appropriate, so let's choose that." 
- ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "> Note: The learning rate finder plot has a logarithmic scale, which is why the middle point between 1e-3 and 1e-2 is between 3e-3 and 4e-3. This is because we care mostly about the order of magnitude of the learning rate." + "In this learning rate plot it appears that a learning rate around 3e-3 would be appropriate, so let's choose that:" ] }, { @@ -1760,38 +1755,45 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Something really interesting about the learning rate finder is that it was only discovered in 2015. Neural networks have been under development since the 1950s. Throughout that time finding a good learning rate has been, perhaps, the most important and challenging issue for practitioners. The idea does not require any advanced maths, giant computing resources, huge datasets, or anything else that would make it inaccessible to any curious researcher. Furthermore, Leslie Smith, was not part of some exclusive Silicon Valley lab, but was working as a naval researcher. All of this is to say: breakthrough work in deep learning absolutely does not require access to vast resources, elite teams, or advanced mathematical ideas. There is lots of work still to be done which requires just a bit of common sense, creativity, and tenacity." + "> Note: Logarithmic Scale: The learning rate finder plot has a logarithmic scale, which is why the middle point between 1e-3 and 1e-2 is between 3e-3 and 4e-3. This is because we care mostly about the order of magnitude of the learning rate." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "It's interesting that the learning rate finder was only discovered in 2015, while neural networks have been under development since the 1950s. Throughout that time finding a good learning rate has been, perhaps, the most important and challenging issue for practitioners. 
The solution does not require any advanced maths, giant computing resources, huge datasets, or anything else that would make it inaccessible to any curious researcher. Furthermore, Leslie Smith was not part of some exclusive Silicon Valley lab, but was working as a naval researcher. All of this is to say: breakthrough work in deep learning absolutely does not require access to vast resources, elite teams, or advanced mathematical ideas. There is lots of work still to be done that requires just a bit of common sense, creativity, and tenacity."
 ]
 },
 {
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-    "Now that we have a good learning rate to train our model, let's look at how we can finetune the weights of a pretrained model."
+    "Now that we have a good learning rate to train our model, let's look at how we can fine-tune the weights of a pretrained model."
 ]
 },
 {
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-    "### Unfreezing and transfer learning"
+    "### Unfreezing and Transfer Learning"
 ]
 },
 {
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-    "We discussed briefly in <> how transfer learning works. We saw that the basic idea is that a pretrained model, trained potentially on millions of data points (such as ImageNet), is fine tuned for some other task. But what does this really mean?\n",
+    "We discussed briefly in <> how transfer learning works. We saw that the basic idea is that a pretrained model, trained potentially on millions of data points (such as ImageNet), is fine-tuned for some other task. But what does this really mean?\n",
    "\n",
-    "We now know that a convolutional neural network consists of many layers with a non-linear activation function between each and one or more final linear layers, with an activation function such as softmax at the very end. 
The final linear layer uses a matrix with enough columns such that the output size is the same as the number of classes in our model (assuming that we are doing classification).\n", + "We now know that a convolutional neural network consists of many linear layers with a nonlinear activation function between each pair, followed by one or more final linear layers with an activation function such as softmax at the very end. The final linear layer uses a matrix with enough columns such that the output size is the same as the number of classes in our model (assuming that we are doing classification).\n", "\n", - "This final linear layer is unlikely to be of any use for us, when we are fine tuning in a transfer learning setting, because it is specifically designed to classify the categories in the original pretraining dataset. So when we do transfer learning we remove it, throw it away, and replace it with a new linear layer with the correct number of outputs for our desired task (in this case, there would be 37 activations).\n", + "This final linear layer is unlikely to be of any use for us when we are fine-tuning in a transfer learning setting, because it is specifically designed to classify the categories in the original pretraining dataset. So when we do transfer learning we remove it, throw it away, and replace it with a new linear layer with the correct number of outputs for our desired task (in this case, there would be 37 activations).\n", "\n", - "This newly added linear layer will have entirely random weights. Therefore, our model prior to fine tuning has entirely random outputs. But that does not mean that it is an entirely random model! All of the layers prior to the last one have been carefully trained to be good at image classification tasks in general. 
As we saw in the images from the Zeiler and Fergus paper in <> (see <> and followings), the first few layers encode very general concepts such as finding gradients and edges, and later layers encode concepts that are still very useful for us, such as finding eyeballs and fur.\n", + "This newly added linear layer will have entirely random weights. Therefore, our model prior to fine-tuning has entirely random outputs. But that does not mean that it is an entirely random model! All of the layers prior to the last one have been carefully trained to be good at image classification tasks in general. As we saw in the images from the [Zeiler and Fergus paper](https://arxiv.org/pdf/1311.2901.pdf) in <> (see <> through <>), the first few layers encode very general concepts, such as finding gradients and edges, and later layers encode concepts that are still very useful for us, such as finding eyeballs and fur.\n", "\n", "We want to train a model in such a way that we allow it to remember all of these generally useful ideas from the pretrained model, use them to solve our particular task (classify pet breeds), and only adjust them as required for the specifics of our particular task.\n", "\n", - "Our challenge when fine tuning is to replace the random weights in our added linear layers with weights that correctly achieve our desired task (classifying pet breeds) without breaking the carefully pretrained weights and the other layers. There is actually a very simple trick to allow this to happen: tell the optimiser to only update the weights in those randomly added final layers. Don't change the weights in the rest of the neural network at all. This is called *freezing* those pretrained layers." + "Our challenge when fine-tuning is to replace the random weights in our added linear layers with weights that correctly achieve our desired task (classifying pet breeds) without breaking the carefully pretrained weights and the other layers. 
There is actually a very simple trick to allow this to happen: tell the optimizer to only update the weights in those randomly added final layers. Don't change the weights in the rest of the neural network at all. This is called *freezing* those pretrained layers." ] }, { @@ -1800,10 +1802,10 @@ "source": [ "When we create a model from a pretrained network fastai automatically freezes all of the pretrained layers for us. When we call the `fine_tune` method fastai does two things:\n", "\n", - "- train the randomly added layers for one epoch, with all other layers frozen ;\n", - "- unfreeze all of the layers, and train them all for the number of epochs requested.\n", + "- Trains the randomly added layers for one epoch, with all other layers frozen\n", + "- Unfreezes all of the layers, and trains them all for the number of epochs requested\n", "\n", - "Although this is a reasonable default approach, it is likely that for your particular dataset you may get better results by doing things slightly differently. The `fine_tune` method has a number of parameters you can use to change its behaviour, but it might be easiest for you to just call the underlying methods directly if you want to get some custom behavior. Remember that you can see the source code for the method by using the following syntax:\n", + "Although this is a reasonable default approach, it is likely that for your particular dataset you may get better results by doing things slightly differently. The `fine_tune` method has a number of parameters you can use to change its behavior, but it might be easiest for you to just call the underlying methods directly if you want to get some custom behavior. 
Remember that you can see the source code for the method by using the following syntax:\n", "\n", " learn.fine_tune??\n", "\n", @@ -1879,7 +1881,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "And then we will unfreeze the model:" + "Then we'll unfreeze the model:" ] }, { @@ -1895,7 +1897,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "...and run `lr_find` again, because having more layers to train, and weights that have already been trained for 3 epochs, means our previously found learning rate isn't appropriate any more:" + "and run `lr_find` again, because having more layers to train, and weights that have already been trained for three epochs, means our previously found learning rate isn't appropriate any more:" ] }, { @@ -1944,7 +1946,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Note that the graph is a little different from when we had random weights: we don't have that sharp descent that indicates the model is training. That's because our model has been trained already. Here we have a somewhat flat area before a sharp increase, and we should take a point well before that sharp increase, for instance 1e-5. The point with the maximum gradient isn't what we look for here and should be ignored.\n", + "Note that the graph is a little different from when we had random weights: we don't have that sharp descent that indicates the model is training. That's because our model has been trained already. Here we have a somewhat flat area before a sharp increase, and we should take a point well before that sharp increase--for instance, 1e-5. The point with the maximum gradient isn't what we look for here and should be ignored.\n", "\n", "Let's train at a suitable learning rate:" ] @@ -2029,39 +2031,39 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "This has improved our model a bit, but there's more we can do. 
The deepest layers of our pretrained model might not need as high a learning rate as the last ones, so we should probably use different learning rates for those, something called discriminative learning rates." + "This has improved our model a bit, but there's more we can do. The deepest layers of our pretrained model might not need as high a learning rate as the last ones, so we should probably use different learning rates for those--this is known as using *discriminative learning rates*." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Discriminative learning rates" + "### Discriminative Learning Rates" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Even after we unfreeze, we still care a lot about the quality of those pretrained weights. We would not expect that the best learning rate for those pretrained parameters would be as high as the randomly added parameters — even after we have tuned those randomly added parameters for a few epochs. Remember, the pretrained weights have been trained for hundreds of epochs, on millions of images.\n", + "Even after we unfreeze, we still care a lot about the quality of those pretrained weights. We would not expect that the best learning rate for those pretrained parameters would be as high as for the randomly added parameters, even after we have tuned those randomly added parameters for a few epochs. Remember, the pretrained weights have been trained for hundreds of epochs, on millions of images.\n", "\n", - "In addition, do you remember the images we saw in <>, showing what each layer learns? The first layer learns very simple foundations, like edge and gradient detectors; these are likely to be just as useful for nearly any task. The later layers learn much more complex concepts, like \"eye\" and \"sunset\", which might not be useful in your task at all (maybe you're classifying car models, for instance). 
So it makes sense to let the later layers fine-tune more quickly than earlier layers.\n", + "In addition, do you remember the images we saw in <>, showing what each layer learns? The first layer learns very simple foundations, like edge and gradient detectors; these are likely to be just as useful for nearly any task. The later layers learn much more complex concepts, like \"eye\" and \"sunset,\" which might not be useful in your task at all (maybe you're classifying car models, for instance). So it makes sense to let the later layers fine-tune more quickly than earlier layers.\n", "\n", - "Therefore, fastai by default does something called *discriminative learning rates*. This was originally developed in the ULMFiT approach to NLP transfer learning that we introduced in <>. Like many good ideas in deep learning, it is extremely simple: use a lower learning rate for the early layers of the neural network, and a higher learning rate for the later layers (and especially the randomly added layers). The idea is based on insights developed by Jason Yosinski, who showed in 2014 that when transfer learning different layers of a neural network should train at different speeds, as seen in <>." + "Therefore, fastai's default approach is to use discriminative learning rates. This was originally developed in the ULMFiT approach to NLP transfer learning that we will introduce in <>. Like many good ideas in deep learning, it is extremely simple: use a lower learning rate for the early layers of the neural network, and a higher learning rate for the later layers (and especially the randomly added layers). The idea is based on insights developed by [Jason Yosinski](https://arxiv.org/abs/1411.1792), who showed in 2014 that with transfer learning different layers of a neural network should train at different speeds, as seen in <>." 
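The "multiplicatively equidistant" spacing described here is just a geometric progression between the two ends of the range. A minimal sketch (a hypothetical helper for illustration, not fastai's internal code; the three-group split is assumed):

```python
def discriminative_lrs(lowest, highest, n_groups):
    """Spread learning rates over parameter groups so that the ratio
    between consecutive groups is constant (geometric spacing), in the
    spirit of passing slice(lowest, highest) to fastai."""
    if n_groups == 1:
        return [highest]
    ratio = (highest / lowest) ** (1 / (n_groups - 1))
    return [lowest * ratio ** i for i in range(n_groups)]

# For example, spreading slice(1e-6, 1e-4) over three parameter groups,
# ordered from the earliest layers to the final (randomly added) layers:
lrs = discriminative_lrs(1e-6, 1e-4, 3)
```

With three groups the earliest layers get 1e-6, the final layers get 1e-4, and the middle group lands at 1e-5: one order of magnitude apart at each step.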
] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "\"Impact" + "\"Impact" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Fastai lets you pass a Python *slice* object anywhere that a learning rate is expected. The first value past will be the learning rate in the earliest layer of the neural network, and the second value will be the learning rate in the final layer. The layers in between will have learning rates that are multiplicatively equidistant throughout that range. Let's use this approach to replicate the previous training, but this time we'll only set the *lowest* layer of our net to a learning rate of `1e-6`; the other layers will scale up to `1e-4`. Let's train for a while and see what happens." + "Fastai lets you pass a Python `slice` object anywhere that a learning rate is expected. The first value passed will be the learning rate in the earliest layer of the neural network, and the second value will be the learning rate in the final layer. The layers in between will have learning rates that are multiplicatively equidistant throughout that range. Let's use this approach to replicate the previous training, but this time we'll only set the *lowest* layer of our net to a learning rate of 1e-6; the other layers will scale up to 1e-4. Let's train for a while and see what happens:" ] }, { @@ -2234,7 +2236,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Now the fine tuning is working great!\n", + "Now the fine-tuning is working great!\n", "\n", "Fastai can show us a graph of the training and validation loss:" ] @@ -2265,64 +2267,64 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "As you can see, the training loss keeps getting better and better. But notice that eventually the validation loss improvement slows, and sometimes even gets worse! This is the point at which the model is starting to over fit. In particular, the model is becoming overconfident of its predictions. 
But this does *not* mean that it is getting less accurate, necessarily. Have a look at the table of training results per epoch, and you will often see that the accuracy continues improving, even as the validation loss gets worse. In the end what matters is your accuracy, or more generally your chosen metrics, not the loss. The loss is just the function we've given the computer to help us to optimise."
+    "As you can see, the training loss keeps getting better and better. But notice that eventually the validation loss improvement slows, and sometimes even gets worse! This is the point at which the model is starting to overfit. In particular, the model is becoming overconfident of its predictions. But this does *not* mean that it is getting less accurate, necessarily. Take a look at the table of training results per epoch, and you will often see that the accuracy continues improving, even as the validation loss gets worse. In the end what matters is your accuracy, or more generally your chosen metrics, not the loss. The loss is just the function we've given the computer to help us to optimize."
 ]
 },
 {
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-    "Another decision you have to make when training the model is for how long."
+    "Another decision you have to make when training the model is how long to train for. We'll consider that next."
 ]
 },
 {
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-    "### Selecting the number of epochs"
+    "### Selecting the Number of Epochs"
 ]
 },
 {
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-    "Often you will find that you are limited by time, rather than generalisation and accuracy, when choosing how many epochs to train for. So your first approach to training should be to simply pick a number of epochs that will train in the amount of time that you are happy to wait for. 
Have a look at the training and validation loss plots, like showed above, and in particular your metrics, and if you see that they are still getting better even in your final epochs, then you know that you have not trained for too long.\n", + "Often you will find that you are limited by time, rather than generalization and accuracy, when choosing how many epochs to train for. So your first approach to training should be to simply pick a number of epochs that will train in the amount of time that you are happy to wait for. Then look at the training and validation loss plots, as shown above, and in particular your metrics, and if you see that they are still getting better even in your final epochs, then you know that you have not trained for too long.\n", "\n", - "On the other hand, you may well see that the metrics you have chosen are really getting worse at the end of training. Remember, it's not just that we're looking for the validation loss to get worse, but your actual metrics. Your validation loss will first of all during training get worse because it gets overconfident, and only later will get worse because it is incorrectly memorising the data. We only care in practice about the latter issue. Our loss function is just something, remember, that we used to allow our optimiser to have something it could differentiate and optimise; it's not actually the thing we care about in practice.\n", + "On the other hand, you may well see that the metrics you have chosen are really getting worse at the end of training. Remember, it's not just that we're looking for the validation loss to get worse, but the actual metrics. Your validation loss will first get worse during training because the model gets overconfident, and only later will get worse because it is incorrectly memorizing the data. We only care in practice about the latter issue. 
Remember, our loss function is just something that we use to allow our optimizer to have something it can differentiate and optimize; it's not actually the thing we care about in practice.\n", "\n", - "Before the days of 1cycle training it was very common to save the model at the end of each epoch, and then select whichever model had the best accuracy, out of all of the models saved in each epoch. This is known as *early stopping*. However, with one cycle training, it is very unlikely to give you the best answer, because those epochs in the middle occur before the learning rate has had a chance to reach the small values, where it can really find the best result. Therefore, if you find that you have overfit, what you should actually do is to retrain your model from scratch, and this time select a total number of epochs based on where your previous best results were found.\n", + "Before the days of 1cycle training it was very common to save the model at the end of each epoch, and then select whichever model had the best accuracy out of all of the models saved in each epoch. This is known as *early stopping*. However, this is very unlikely to give you the best answer, because those epochs in the middle occur before the learning rate has had a chance to reach the small values, where it can really find the best result. Therefore, if you find that you have overfit, what you should actually do is retrain your model from scratch, and this time select a total number of epochs based on where your previous best results were found.\n", "\n", - "If we've got the time to train for more epochs, we may want to instead use that time to train more parameters, that is use a deeper architecture." + "If you have the time to train for more epochs, you may want to instead use that time to train more parameters--that is, use a deeper architecture." 
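The retraining rule described above can be sketched as follows. This is a toy helper with invented accuracy numbers, purely for illustration:

```python
def epochs_to_retrain(val_metrics):
    """Given per-epoch validation metrics from a finished run (higher is
    better), return how many epochs to use when retraining from scratch:
    train up to the epoch where the metric peaked, rather than restoring
    a mid-training checkpoint as classic early stopping would."""
    best_epoch = max(range(len(val_metrics)), key=lambda i: val_metrics[i])
    return best_epoch + 1            # epochs are counted from 1

# Validation accuracy from a hypothetical run that starts to overfit:
history = [0.80, 0.88, 0.91, 0.93, 0.92, 0.90]
n_epochs = epochs_to_retrain(history)   # the metric peaked at epoch 4
```

We would then retrain from scratch with that number of epochs, so the full 1cycle schedule (including the small learning rates at the end) fits inside the useful part of training.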
]
 },
 {
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-    "### Deeper architectures"
+    "### Deeper Architectures"
 ]
 },
 {
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-    "In general, a model with more parameters can model your data more accurately. (There are lots and lots of caveats to this generalisation, and it depends on the specifics of the architectures you are using, but it is a reasonable rule of thumb for now.) For most of the architectures that we will be seeing in this book you can create larger versions of them by simply adding more layers. However, since we want to use pretrained models, we need to make sure that we choose a number of layers that has been already pretrained for us.\n",
+    "In general, a model with more parameters can model your data more accurately. (There are lots and lots of caveats to this generalization, and it depends on the specifics of the architectures you are using, but it is a reasonable rule of thumb for now.) For most of the architectures that we will be seeing in this book, you can create larger versions of them by simply adding more layers. However, since we want to use pretrained models, we need to make sure that we choose a number of layers that have already been pretrained for us.\n",
    "\n",
-    "This is why, in practice, architectures tend to come in a small number of variants. For instance, the resnet architecture that we are using in this chapter comes in 18, 34, 50, 101, and 152 layer variants, pre-trained on ImageNet. A larger (more layers and parameters; sometimes described as the \"capacity\" of a model) version of a resnet will always be able to give us a better training loss, but it can suffer more from overfitting, because it has more parameters to over fit with.\n",
+    "This is why, in practice, architectures tend to come in a small number of variants. For instance, the ResNet architecture that we are using in this chapter comes in variants with 18, 34, 50, 101, and 152 layers, pretrained on ImageNet. 
A larger (more layers and parameters; sometimes described as the \"capacity\" of a model) version of a ResNet will always be able to give us a better training loss, but it can suffer more from overfitting, because it has more parameters to overfit with.\n", "\n", - "In general, a bigger model has the ability to better capture the real underlying relationships in your data, and also to capture and memorise the specific details of your individual images.\n", + "In general, a bigger model has the ability to better capture the real underlying relationships in your data, and also to capture and memorize the specific details of your individual images.\n", "\n", - "However, using a deeper model is going to require more GPU RAM, so we may need to lower the size of our batches to avoid *out-of-memory errors*. This happens when you try to fit too much inside your GPU and looks like:\n", + "However, using a deeper model is going to require more GPU RAM, so you may need to lower the size of your batches to avoid an *out-of-memory error*. This happens when you try to fit too much inside your GPU and looks like:\n", "\n", "```\n", "Cuda runtime error: out of memory\n", "```\n", "\n", - "You may have to restart your notebook when this happens, and the way to solve it is to use a smaller *batch size*, which means we will pass smaller groups of images at any given time through our model. We can pass the batch size we want to the call creating our `DataLoaders` with `bs=`.\n", + "You may have to restart your notebook when this happens. The way to solve it is to use a smaller batch size, which means passing smaller groups of images at any given time through your model. You can pass the batch size you want to the call creating your `DataLoaders` with `bs=`.\n", "\n", - "The other downside of deeper architectures is that they take quite a bit longer to train. One thing that can speed things up a lot is *mixed precision training*. 
This refers to using less precise numbers (*half precision floating point*, also called *fp16*) where possible during training. As we are writing these words (early 2020) nearly all current NVIDIA GPUs support a special feature called *tensor cores* which can dramatically (2x-3x) speed up neural network training. They also require a lot less GPU memory. To enable this feature in fastai, just add `to_fp16()` after your `Learner` creation (you also need to import the module).\n", + "The other downside of deeper architectures is that they take quite a bit longer to train. One technique that can speed things up a lot is *mixed-precision training*. This refers to using less-precise numbers (*half-precision floating point*, also called *fp16*) where possible during training. As we are writing these words in early 2020, nearly all current NVIDIA GPUs support a special feature called *tensor cores* that can dramatically speed up neural network training, by 2-3x. They also require a lot less GPU memory. To enable this feature in fastai, just add `to_fp16()` after your `Learner` creation (you also need to import the module).\n", "\n", - "You can't really know ahead of time what the best architecture for your particular problem is, until you try training some. So let's try a resnet 50 now with mixed precision:" + "You can't really know ahead of time what the best architecture for your particular problem is--you need to try training some. So let's try a ResNet-50 now with mixed precision:" ] }, { @@ -2461,20 +2463,20 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Summary" + "## Conclusion" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "In this chapter we learned some important practical tips, both for getting our image data ready for modeling (presizing; data block summary) and for fitting the model (learning rate finder, unfreezing, discriminative learning rates, setting the number of epochs, and using deeper architectures). 
Using these tools will help you to build more accurate image models, more quickly.\n", + "In this chapter you learned some important practical tips, both for getting your image data ready for modeling (presizing, data block summary) and for fitting the model (learning rate finder, unfreezing, discriminative learning rates, setting the number of epochs, and using deeper architectures). Using these tools will help you to build more accurate image models, more quickly.\n", "\n", - "We also learned about cross entropy loss. This part of the book is worth spending plenty of time on. You aren't likely to need to actually implement cross entropy loss from scratch yourself in practice, but it's really important you understand the inputs to and output from that function, because it (or a variant of it, as we'll see in the next chapter) is used in nearly every classification model. So when you want to debug a model, or put a model in production, or improve the accuracy of a model, you're going to need to be able to look at its activations and loss, and understand what's going on, and why. You can't do that properly if you don't understand your loss function.\n", + "We also discussed cross-entropy loss. This part of the book is worth spending plenty of time on. You aren't likely to need to actually implement cross-entropy loss from scratch yourself in practice, but it's really important you understand the inputs to and output from that function, because it (or a variant of it, as we'll see in the next chapter) is used in nearly every classification model. So when you want to debug a model, or put a model in production, or improve the accuracy of a model, you're going to need to be able to look at its activations and loss, and understand what's going on, and why. You can't do that properly if you don't understand your loss function.\n", "\n", - "If cross entropy loss hasn't \"clicked\" for you just yet, don't worry--you'll get there! 
First, go back to the last chapter and make sure you really understand `mnist_loss`. Then work gradually through the cells of the notebook for this chapter, where we step through each piece of cross entropy loss. Make sure you understand what each calculation is doing, and why. Try creating some small tensors yourself and pass them into the functions, to see what they return.\n", + "If cross-entropy loss hasn't \"clicked\" for you just yet, don't worry--you'll get there! First, go back to the last chapter and make sure you really understand `mnist_loss`. Then work gradually through the cells of the notebook for this chapter, where we step through each piece of cross-entropy loss. Make sure you understand what each calculation is doing, and why. Try creating some small tensors yourself and pass them into the functions, to see what they return.\n", "\n", - "Remember: the choices made in cross entropy loss are not the only possible choices that could have been made. Just like when we looked at regression, we could choose between mean squared error and mean absolute difference (L1), we could change the details inside cross entropy loss too. If you have other ideas for possible functions that you think might work, feel free to give them a try in this chapter's notebook! (Fair warning though: you'll probably find that the model will be slower to train, and less accurate. That's because the gradient of cross entropy loss is proportional to the difference between the activation and the target, so SGD always gets a nicely scaled step for the weights.)" + "Remember: the choices made in the implementation of cross-entropy loss are not the only possible choices that could have been made. Just as when we looked at regression we could choose between mean squared error and mean absolute difference (L1), we could change the details inside cross-entropy loss too. If you have other ideas for possible functions that you think might work, feel free to give them a try in this chapter's notebook! 
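In the spirit of the suggestion above to pass small tensors through the functions yourself, here is a minimal NumPy sketch of cross-entropy loss (softmax followed by a negative log of the correct class's probability); PyTorch's built-in version does the same thing in a more numerically robust way:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())       # subtract the max for numerical stability
    return e / e.sum()

def cross_entropy(activations, target):
    # negative log of the probability assigned to the correct class
    return -np.log(softmax(activations)[target])

acts = np.array([0.02, -2.49, 1.25])   # made-up activations for 3 classes
loss = cross_entropy(acts, target=2)   # suppose the correct class is index 2
assert np.isclose(softmax(acts).sum(), 1.0)
assert loss > 0
```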
(Fair warning though: you'll probably find that the model will be slower to train, and less accurate. That's because the gradient of cross-entropy loss is proportional to the difference between the activation and the target, so SGD always gets a nicely scaled step for the weights.)" ] }, { @@ -2489,35 +2491,35 @@ "metadata": {}, "source": [ "1. Why do we first resize to a large size on the CPU, and then to a smaller size on the GPU?\n", - "1. If you are not familiar with regular expressions, find a regular expression tutorial, and some problem sets, and complete them. Have a look on the book website for suggestions.\n", + "1. If you are not familiar with regular expressions, find a regular expression tutorial, and some problem sets, and complete them. Have a look on the book's website for suggestions.\n", "1. What are the two ways in which data is most commonly provided, for most deep learning datasets?\n", "1. Look up the documentation for `L` and try using a few of the new methods that it adds.\n", - "1. Look up the documentation for the Python pathlib module and try using a few methods of the Path class.\n", + "1. Look up the documentation for the Python `pathlib` module and try using a few methods of the `Path` class.\n", "1. Give two examples of ways that image transformations can degrade the quality of the data.\n", - "1. What method does fastai provide to view the data in a DataLoader?\n", - "1. What method does fastai provide to help you debug a DataBlock?\n", + "1. What method does fastai provide to view the data in a `DataLoaders`?\n", + "1. What method does fastai provide to help you debug a `DataBlock`?\n", "1. Should you hold off on training a model until you have thoroughly cleaned your data?\n", - "1. What are the two pieces that are combined into cross entropy loss in PyTorch?\n", + "1. What are the two pieces that are combined into cross-entropy loss in PyTorch?\n", "1. What are the two properties of activations that softmax ensures? 
Why is this important?\n", "1. When might you want your activations to not have these two properties?\n", - "1. Calculate the \"exp\" and \"softmax\" columns of <> yourself (i.e. in a spreadsheet, with a calculator, or in a notebook).\n", - "1. Why can't we use torch.where to create a loss function for datasets where our label can have more than two categories?\n", + "1. Calculate the `exp` and `softmax` columns of <> yourself (i.e., in a spreadsheet, with a calculator, or in a notebook).\n", + "1. Why can't we use `torch.where` to create a loss function for datasets where our label can have more than two categories?\n", "1. What is the value of log(-2)? Why?\n", "1. What are two good rules of thumb for picking a learning rate from the learning rate finder?\n", - "1. What two steps does the fine_tune method do?\n", - "1. In Jupyter notebook, how do you get the source code for a method or function?\n", + "1. What two steps does the `fine_tune` method do?\n", + "1. In Jupyter Notebook, how do you get the source code for a method or function?\n", "1. What are discriminative learning rates?\n", - "1. How is a Python slice object interpreted when passed as a learning rate to fastai?\n", - "1. Why is early stopping a poor choice when using one cycle training?\n", - "1. What is the difference between resnet 50 and resnet101?\n", - "1. What does to_fp16 do?" + "1. How is a Python `slice` object interpreted when passed as a learning rate to fastai?\n", + "1. Why is early stopping a poor choice when using 1cycle training?\n", + "1. What is the difference between `resnet50` and `resnet101`?\n", + "1. What does `to_fp16` do?" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research" + "### Further Research" ] }, { @@ -2525,7 +2527,7 @@ "metadata": {}, "source": [ "1. Find the paper by Leslie Smith that introduced the learning rate finder, and read it.\n", - "1. See if you can improve the accuracy of the classifier in this chapter. 
What's the best accuracy you can achieve? Have a look on the forums and book website to see what other students have achieved with this dataset, and how they did it." + "1. See if you can improve the accuracy of the classifier in this chapter. What's the best accuracy you can achieve? Look on the forums and the book's website to see what other students have achieved with this dataset, and how they did it." ] }, { diff --git a/06_multicat.ipynb b/06_multicat.ipynb index 59192d367..512154dd9 100644 --- a/06_multicat.ipynb +++ b/06_multicat.ipynb @@ -21,16 +21,16 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# Other computer vision problems" + "# Other Computer Vision Problems" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "In the previous chapter we learnt some important practical techniques for training models in practice. Issues like selecting learning rates and the number of epochs are very important to getting good results.\n", + "In the previous chapter you learned some important practical techniques for training models in practice. Considerations like selecting learning rates and the number of epochs are very important to getting good results.\n", "\n", - "In this chapter we are going to look at other types of computer vision problems, multi-label classification and regression. The first one is when you want to predict more than one label per image (or sometimes none at all) and the second one is when your labels are one (or several) number, a quantity instead of a category.\n", + "In this chapter we are going to look at two other types of computer vision problems: multi-label classification and regression. The first one is when you want to predict more than one label per image (or sometimes none at all), and the second is when your labels are one or several numbers--a quantity instead of a category.\n", "\n", "In the process we will study more deeply the output activations, targets, and loss functions in deep learning models."
] @@ -39,34 +39,34 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Multi-label classification" + "## Multi-Label Classification" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Multi-label classification refers to the problem of identifying the categories of objects in an image, where you may not have exactly one type of object in the image. There may be more than one kind of object, or there may be no objects at all in the classes that you are looking for.\n", + "Multi-label classification refers to the problem of identifying the categories of objects in images that may not contain exactly one type of object. There may be more than one kind of object, or there may be no objects at all in the classes that you are looking for.\n", "\n", - "For instance, this would have been a great approach for our bear classifier. One problem with the bear classifier that we rolled out in <> is that if a user uploaded something that wasn't any kind of bear, the model would still say it was either a grizzly, black, or teddy bear — it had no ability to predict \"not a bear at all\". In fact, after we have completed this chapter, it would be a great exercise for you to go back to your image classifier application, and try to retrain it using the multi-label technique. And then, tested by passing in an image which is not of any of your recognised classes.\n", + "For instance, this would have been a great approach for our bear classifier. 
One problem with the bear classifier that we rolled out in <> was that if a user uploaded something that wasn't any kind of bear, the model would still say it was either a grizzly, black, or teddy bear—it had no ability to predict \"not a bear at all.\" In fact, after we have completed this chapter, it would be a great exercise for you to go back to your image classifier application, and try to retrain it using the multi-label technique, then test it by passing in an image that is not of any of your recognized classes.\n", "\n", - "In practice, we have not seen many examples of people training multi-label classifiers for this purpose. But we very often see both users and developers complaining about this problem. It appears that this simple solution is not at all widely understood or appreciated. Because in practice it is probably more common to have some images with zero matches or more than one match, we should probably expect in practice that multi-label classifiers are more widely applicable than single label classifiers.\n", + "In practice, we have not seen many examples of people training multi-label classifiers for this purpose--but we very often see both users and developers complaining about this problem. It appears that this simple solution is not at all widely understood or appreciated! Because in practice it is probably more common to have some images with zero matches or more than one match, we should probably expect in practice that multi-label classifiers are more widely applicable than single-label classifiers.\n", "\n", - "First, we'll seee what a multi-label dataset looks like, then we'll explain how to get it ready for our model. Then we'll see that the architecture does not change from last chapter, only the loss function does. Let's start with the data." + "First, let's see what a multi-label dataset looks like, then we'll explain how to get it ready for our model. 
You'll see that the architecture of the model does not change from the last chapter; only the loss function does. Let's start with the data." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### The data" + "### The Data" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "For our example we are going to use the *Pascal* dataset, which can have more than one kind of classified object per image.\n", + "For our example we are going to use the PASCAL dataset, which can have more than one kind of classified object per image.\n", "\n", "We begin by downloading and extracting the dataset as per usual:" ] @@ -85,7 +85,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "This dataset is different to the ones we have seen before, and that it is not structured by file name or folder, but instead comes with a CSV (comma separated values) file telling us what labels to use for each image. We can have a look at the CSV file by reading it into a Pandas DataFrame:" + "This dataset is different from the ones we have seen before, in that it is not structured by filename or folder but instead comes with a CSV (comma-separated values) file telling us what labels to use for each image. We can inspect the CSV file by reading it into a Pandas DataFrame:" ] }, { @@ -177,7 +177,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We can see that the list of categories in each image is shown as a space delimited string." + "As you can see, the list of categories in each image is shown as a space-delimited string." ] }, { @@ -191,9 +191,9 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "No, it’s not actually a panda! *Pandas* is a Python library that is used to manipulate and analyse tabular and timeseries data. The main class is `DataFrame`, which represents a table of rows and columns. You can get a DataFrame from a CSV file, a database table, python dictionaries, and many other sources. 
In Jupyter, a DataFrame is output as a formatted table, as you see above.\n", + "No, it’s not actually a panda! *Pandas* is a Python library that is used to manipulate and analyze tabular and time series data. The main class is `DataFrame`, which represents a table of rows and columns. You can get a DataFrame from a CSV file, a database table, Python dictionaries, and many other sources. In Jupyter, a DataFrame is output as a formatted table, as shown here.\n", "\n", - "You can access rows and columns of a DataFrame with the `iloc` property, which lets you access rows and columns as if it is a matrix:" + "You can access rows and columns of a DataFrame with the `iloc` property, as if it were a matrix:" ] }, { @@ -248,7 +248,7 @@ ], "source": [ "df.iloc[0,:]\n", - "# Trailing ‘:’s are always optional (in numpy, PyTorch, pandas, etc),\n", + "# Trailing :s are always optional (in numpy, pytorch, pandas, etc.),\n", "# so this is equivalent:\n", "df.iloc[0]" ] @@ -427,7 +427,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Pandas is a fast and flexible library, and is an important part of every data scientist’s Python toolbox. Unfortunately, its API can be rather confusing and surprising, so it takes a while to get familiar with it. If you haven’t used Pandas before, we’d suggest going through a tutorial; we are particularly fond of the book “*Python for Data Analysis*” by Wes McKinney, the creator of Pandas. It also covers other important libraries like matplotlib and numpy. We will try to briefly describe Pandas functionality we use as we come across it, but will not go into the level of detail of McKinney’s book." + "Pandas is a fast and flexible library, and an important part of every data scientist’s Python toolbox. Unfortunately, its API can be rather confusing and surprising, so it takes a while to get familiar with it. 
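To try the `iloc` indexing described above, here is a toy DataFrame (the filenames and labels are made up for illustration, not read from the PASCAL CSV):

```python
import pandas as pd

df = pd.DataFrame({'fname':  ['000005.jpg', '000007.jpg'],
                   'labels': ['chair', 'car person']})

row = df.iloc[0, :]    # first row, all columns
col = df.iloc[:, 0]    # all rows, first column

# the trailing colon is optional, so these give the same row
assert row.equals(df.iloc[0])
assert list(col) == ['000005.jpg', '000007.jpg']
```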
If you haven’t used Pandas before, we’d suggest going through a tutorial; we are particularly fond of the book [*Python for Data Analysis*](http://shop.oreilly.com/product/0636920023784.do) by Wes McKinney, the creator of Pandas (O'Reilly). It also covers other important libraries like `matplotlib` and `numpy`. We will try to briefly describe Pandas functionality we use as we come across it, but will not go into the level of detail of McKinney’s book." ] }, { @@ -448,7 +448,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Constructing a data block" + "### Constructing a DataBlock" ] }, { @@ -459,106 +459,8 @@ "\n", "As we have seen, PyTorch and fastai have two main classes for representing and accessing a training set or validation set:\n", "\n", - "- `Dataset`:: a collection which returns a tuple of your independent and dependent variable for a single item\n", - "- `DataLoader`:: an iterator which provides a stream of mini batches, where each mini batch is a couple of a batch of independent variables and a batch of dependent variables" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "((0, 'a'), 26)" - ] - }, - "execution_count": null, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "a = list(enumerate(string.ascii_lowercase))\n", - "a[0],len(a)" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "(tensor([25, 11, 4, 1, 7, 21, 19, 0]),\n", - " ('z', 'l', 'e', 'b', 'h', 'v', 't', 'a'))" - ] - }, - "execution_count": null, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "dl_a = DataLoader(a, batch_size=8, shuffle=True)\n", - "b = first(dl_a)\n", - "b" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "[(tensor(25), 'z'),\n", - " (tensor(11), 'l'),\n", 
- " (tensor(4), 'e'),\n", - " (tensor(1), 'b'),\n", - " (tensor(7), 'h'),\n", - " (tensor(21), 'v'),\n", - " (tensor(19), 't'),\n", - " (tensor(0), 'a')]" - ] - }, - "execution_count": null, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "list(zip(b[0],b[1]))" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "[(tensor(25), 'z'),\n", - " (tensor(11), 'l'),\n", - " (tensor(4), 'e'),\n", - " (tensor(1), 'b'),\n", - " (tensor(7), 'h'),\n", - " (tensor(21), 'v'),\n", - " (tensor(19), 't'),\n", - " (tensor(0), 'a')]" - ] - }, - "execution_count": null, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "list(zip(*b))" + "- `Dataset`:: A collection that returns a tuple of your independent and dependent variable for a single item\n", + "- `DataLoader`:: An iterator that provides a stream of mini-batches, where each mini-batch is a couple of a batch of independent variables and a batch of dependent variables" ] }, { @@ -567,161 +469,17 @@ "source": [ "On top of these, fastai provides two classes for bringing your training and validation sets together:\n", "\n", - "- `Datasets`:: an object which contains a training `Dataset` and a validation `Dataset`\n", - "- `DataLoaders`:: an object which contains a training `DataLoader` and a validation `DataLoader`\n", + "- `Datasets`:: An object that contains a training `Dataset` and a validation `Dataset`\n", + "- `DataLoaders`:: An object that contains a training `DataLoader` and a validation `DataLoader`\n", "\n", - "Since a `DataLoader` builds on top of a `Dataset`, and adds additional functionality to it (collating multiple items into a mini batch), it’s often easiest to start by creating and testing `Datasets`, and then look at `DataLoaders` after that’s working." 
- ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "('a', 26)" - ] - }, - "execution_count": null, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "a = list(string.ascii_lowercase)\n", - "a[0],len(a)" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "('a',)" - ] - }, - "execution_count": null, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "dss = Datasets(a)\n", - "dss[0]" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "def f1(o): return o+'a'\n", - "def f2(o): return o+'b'" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "('aa',)" - ] - }, - "execution_count": null, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "dss = Datasets(a, [[f1]])\n", - "dss[0]" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "('aab',)" - ] - }, - "execution_count": null, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "dss = Datasets(a, [[f1,f2]])\n", - "dss[0]" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "('aa', 'ab')" - ] - }, - "execution_count": null, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "dss = Datasets(a, [[f1],[f2]])\n", - "dss[0]" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "dls = DataLoaders.from_dsets(dss, batch_size=4)" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "(('da', 'aa', 'ea', 'na'), 
('db', 'ab', 'eb', 'nb'))" - ] - }, - "execution_count": null, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "first(dls.train)" + "Since a `DataLoader` builds on top of a `Dataset` and adds additional functionality to it (collating multiple items into a mini-batch), it’s often easiest to start by creating and testing `Datasets`, and then look at `DataLoaders` after that’s working." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "When we create a `DataBlock`, we build up gradually, step-by-step, and use the notebook to check our data along the way. This is a great way to make sure that you maintain momentum as you are coding, and that you keep an eye out for any problems. It’s easy to debug, because you know that if there are any problems, it is in the line of code you just typed!\n", + "When we create a `DataBlock`, we build up gradually, step by step, and use the notebook to check our data along the way. This is a great way to make sure that you maintain momentum as you are coding, and that you keep an eye out for any problems. It’s easy to debug, because you know that if a problem arises, it is in the line of code you just typed!\n", "\n", "Let’s start with the simplest case, which is a data block created with no parameters:" ] @@ -739,7 +497,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We can create a `Datasets` object from this. The only thing needed is a source, in this case, our dataframe:" + "We can create a `Datasets` object from this. The only thing needed is a source--in this case, our DataFrame:" ] }, { @@ -755,7 +513,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "this contains a `train` and a `valid` dataset, which we can index into:" + "This contains a `train` and a `valid` dataset, which we can index into:" ] }, { @@ -809,7 +567,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "As you can see, this simply returns a row of the dataframe, twice. 
This is because by default, the datablock assumes we have two things: input and target. We are going to need to grab the appropriate fields from the DataFrame, which we can do by passing `get_x` and `get_y` functions:" + "As you can see, this simply returns a row of the DataFrame, twice. This is because by default, the data block assumes we have two things: input and target. We are going to need to grab the appropriate fields from the DataFrame, which we can do by passing `get_x` and `get_y` functions:" ] }, { @@ -858,7 +616,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "As you can see, rather than defining a function in the usual way, we are using Python’s *lambda* keyword. This is just a shortcut for defining and then referring to a function. The above is identical to the following more verbose approach:" + "As you can see, rather than defining a function in the usual way, we are using Python’s `lambda` keyword. This is just a shortcut for defining and then referring to a function. The following more verbose approach is identical:" ] }, { @@ -889,53 +647,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "lambda functions are great for quickly iterating, however they are not compatible with serialization, so we advise you to use the more verbose approach if you want to export your `Learner` after training (they are fine if you are just experimenting)." + "Lambda functions are great for quickly iterating, but they are not compatible with serialization, so we advise you to use the more verbose approach if you want to export your `Learner` after training (lambdas are fine if you are just experimenting)." 
] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "We can see that the independent variable will need to be converted into a complete path, so that we can open it as an image, and the second will need to be split on the space character (which is the default for Python’s split function) so that it becomes a list:" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "Path.BASE_PATH = None" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "Path('/home/jhoward/.fastai/data/pascal_2007')" - ] - }, - "execution_count": null, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "path" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "#hide\n", - "Path.BASE_PATH = path" + "We can see that the independent variable will need to be converted into a complete path, so that we can open it as an image, and the dependent variable will need to be split on the space character (which is the default for Python’s `split` function) so that it becomes a list:" ] }, { @@ -967,7 +686,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "To actually open the image and do the conversion to tensors, we will need to use a set of transforms; block types will provide us with those. We can use the same block types that we have used previously, with one exception. The `ImageBlock` will work fine again, because we have a path which points to a valid image, but the `CategoryBlock` is not going to work. The problem is: that block returns a single integer. But we need to be able to have multiple labels for each item. To solve this, we use a `MultiCategoryBlock`. 
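The space-splitting step mentioned above relies only on Python's built-in `str.split`, which splits on whitespace by default:

```python
# A space-delimited label string, as stored in the labels CSV,
# becomes a list of category names:
labels = 'car person'.split()
assert labels == ['car', 'person']

# The no-argument form also collapses repeated spaces, which an
# explicit separator does not:
assert 'car  person'.split() == ['car', 'person']
assert 'car  person'.split(' ') == ['car', '', 'person']
```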
This type of block expects to receive a list of strings, as we have in this case, so let’s test it out:" + "To actually open the image and do the conversion to tensors, we will need to use a set of transforms; block types will provide us with those. We can use the same block types that we have used previously, with one exception: the `ImageBlock` will work fine again, because we have a path that points to a valid image, but the `CategoryBlock` is not going to work. The problem is that block returns a single integer, but we need to be able to have multiple labels for each item. To solve this, we use a `MultiCategoryBlock`. This type of block expects to receive a list of strings, as we have in this case, so let’s test it out:" ] }, { @@ -998,21 +717,21 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "As you can see, our list of categories is not encoded in the same way that it was for the regular CategoryBlock. In that case, we had a single integer, representing which category was present, based on its location in our vocab. In this case, however, we instead have a list of zeros, with a one in any position where that category is present. For example, if there is a one in the second and fourth positions, then that means that vocab items two and four are present in this image. This is known as *one hot encoding*. The reason we can’t easily just use a list of category indices, is that each list would be a different length, and PyTorch requires tensors, where everything has to be the same length." + "As you can see, our list of categories is not encoded in the same way that it was for the regular `CategoryBlock`. In that case, we had a single integer representing which category was present, based on its location in our vocab. In this case, however, we instead have a list of zeros, with a one in any position where that category is present. 
For example, if there is a one in the second and fourth positions, then that means that vocab items two and four are present in this image. This is known as *one-hot encoding*. The reason we can’t easily just use a list of category indices is that each list would be a different length, and PyTorch requires tensors, where everything has to be the same length." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "> jargon: One hot encoding: using a vector of zeros, with a one in each location that is represented in the data, to encode a list of integers." + "> jargon: One-hot encoding: Using a vector of zeros, with a one in each location that is represented in the data, to encode a list of integers." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Let’s check what the categories represent for this example (we are using the convenient torch.where function, which tells us all of the indices where our condition is true or false):" + "Let’s check what the categories represent for this example (we are using the convenient `torch.where` function, which tells us all of the indices where our condition is true or false):" ] }, { @@ -1040,9 +759,9 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "With numpy arrays, PyTorch tensors, and fastai’s L class, you can index directly using a list or vector, which makes a lot of code (such as this example) much clearer and more concise.\n", + "With NumPy arrays, PyTorch tensors, and fastai’s `L` class, we can index directly using a list or vector, which makes a lot of code (such as this example) much clearer and more concise.\n", "\n", - "We have ignored the column `is_valid` up until now, which means that `DataBlock` has been using a random split by default. To explicitly choose the elements of our validation set, we need to write a function and pass it to `splitter` (or use one of fastai's predefined functions or classes). 
It will take the items (here our whole dataframe) and must return two (or more) list of integers." + "With NumPy arrays, PyTorch tensors, and fastai’s `L` class, we can index directly using a list or vector, which makes a lot of code (such as this example) much clearer and more concise.\n", "\n", "We have ignored the column `is_valid` up until now, which means that `DataBlock` has been using a random split by default. To explicitly choose the elements of our validation set, we need to write a function and pass it to `splitter` (or use one of fastai's predefined functions or classes). It will take the items (here our whole DataFrame) and must return two (or more) lists of integers:" ] }, { @@ -1081,7 +800,9 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "As we have discussed, a `DataLoader` collates the items from a `Dataset` into a mini batch. This is a tuple of tensors, where each tensor simply stacks the items from that location in the `Dataset` item. Now that we have confirmed that the individual items look okay there's one more step we need to ensure we can create our `DataLoaders`, which is to ensure that every item is of the same size. To do this, we can use `RandomResizedCrop`:" + "As we have discussed, a `DataLoader` collates the items from a `Dataset` into a mini-batch. This is a tuple of tensors, where each tensor simply stacks the items from that location in the `Dataset` item. \n", + "\n", + "Now that we have confirmed that the individual items look okay, there's one more step we need to take before we can create our `DataLoaders`, which is to ensure that every item is of the same size. To do this, we can use `RandomResizedCrop`:" ] }, { @@ -1131,28 +852,28 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "And remember that if anything goes wrong when you create your `DataLoaders` from your `DataBlock`, or if you want to view exactly what happens with your `DataBlock`, you can use the `summary` method we presented in the last chapter."
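The one-hot encoding discussed a little earlier can be sketched in plain Python (a simplified, hypothetical stand-in for what `MultiCategoryBlock` produces; the vocab here is made up):

```python
import numpy as np

vocab = ['bicycle', 'car', 'chair', 'person']

def one_hot(labels, vocab):
    # a vector of zeros, with a 1. wherever a label is present
    out = np.zeros(len(vocab))
    for label in labels:
        out[vocab.index(label)] = 1.
    return out

enc = one_hot(['car', 'person'], vocab)
assert list(enc) == [0., 1., 0., 1.]

# np.where recovers the indices of the present categories,
# like the torch.where trick shown in the chapter
assert [vocab[i] for i in np.where(enc == 1.)[0]] == ['car', 'person']
```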
+ "Remember that if anything goes wrong when you create your `DataLoaders` from your `DataBlock`, or if you want to view exactly what happens with your `DataBlock`, you can use the `summary` method we presented in the last chapter." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Our data is now ready for training a model. As we will see, nothing is going to change when we create our `Learner`, but behind the scenes, the fastai library will pick a new loss function for us: binary cross entropy." + "Our data is now ready for training a model. As we will see, nothing is going to change when we create our `Learner`, but behind the scenes, the fastai library will pick a new loss function for us: binary cross-entropy." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Binary cross entropy" + "### Binary Cross-Entropy" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Now we'll create our `Learner`. We saw in <> that a `Learner` object contains four main things: the model, a `DataLoaders` object, an `Optimizer`, and the loss function to use. We already how our `DataLoaders`, and we can leverage fastai's `resnet` models (which we'll learn how to create from scratch later), and we know how to create an `SGD` optimizer. So let's focus on ensuring we have a suitable loss function. To do this, let's use `cnn_learner` to create a `Learner`, so we can look at its activations:" + "Now we'll create our `Learner`. We saw in <> that a `Learner` object contains four main things: the model, a `DataLoaders` object, an `Optimizer`, and the loss function to use. We already have our `DataLoaders`, we can leverage fastai's `resnet` models (which we'll learn how to create from scratch later), and we know how to create an `SGD` optimizer. So let's focus on ensuring we have a suitable loss function. 
To do this, let's use `cnn_learner` to create a `Learner`, so we can look at its activations:" ] }, { @@ -1168,7 +889,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We also saw that the model in a `Learner` is generally an object of a class inheriting from `nn.Module`, and that you can call it using parentheses and it will return the activations of a model. You should pass it your independent variable, as a mini batch. We can try it out by grabbing a mini batch from our `DataLoader`, and then passing it to the model:" + "We also saw that the model in a `Learner` is generally an object of a class inheriting from `nn.Module`, and that we can call it using parentheses and it will return the activations of a model. You should pass it your independent variable, as a mini-batch. We can try it out by grabbing a mini batch from our `DataLoader` and then passing it to the model:" ] }, { @@ -1197,7 +918,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Have a think about why `activs` has this shape… We have a batch size of 64. And we need to calculate the probability of each of 20 categories. Here’s what one of those activations looks like:" + "Think about why `activs` has this shape--we have a batch size of 64, and we need to calculate the probability of each of 20 categories. Here’s what one of those activations looks like:" ] }, { @@ -1225,14 +946,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "> note: Knowing how to manually get a mini batch and pass it into a model, and look at the activations and loss, is really important for debugging your model. It is also very helpful for learning, so that you can see exactly what is going on." + "> note: Getting Model Activations: Knowing how to manually get a mini-batch and pass it into a model, and look at the activations and loss, is really important for debugging your model. It is also very helpful for learning, so that you can see exactly what is going on." 
] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "They aren’t yet scaled between zero and one. We learned in <> how to scale activations to be between zero and one: the `sigmoid` function. We also saw how to calculate a loss based on this--this is our loss function from <>, with the addition of `log` as discussed in the last chapter:" + "They aren’t yet scaled to between 0 and 1, but we learned how to do that in <>, using the `sigmoid` function. We also saw how to calculate a loss based on this--this is our loss function from <>, with the addition of `log` as discussed in the last chapter:" ] }, { @@ -1250,19 +971,19 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Note that because we have a one-hot encoded dependent variable, we can't directly use `nll_loss` or `softmax` (and therefore we can't use `cross_entropy`):\n", + "Note that because we have a one-hot-encoded dependent variable, we can't directly use `nll_loss` or `softmax` (and therefore we can't use `cross_entropy`):\n", "\n", - "- **softmax**, as we saw, requires that all predictions sum to one, and tends to push one activation to be much larger than the others (due to the use of `exp`); however, we may well have multiple objects that we're confident appear in an image, so restricting the maximum sum of activations to one is not a good idea. By the same reasoning, we may want the sum to be *less* than one, if we don't think *any* of the categories appear in an image.\n", - "- **nll_loss**, as we saw, returns the value of just one activation: the single activation corresponding with the single label for an item. This doesn't make sense when we have multiple labels.\n", + "- `softmax`, as we saw, requires that all predictions sum to 1, and tends to push one activation to be much larger than the others (due to the use of `exp`); however, we may well have multiple objects that we're confident appear in an image, so restricting the maximum sum of activations to 1 is not a good idea. 
By the same reasoning, we may want the sum to be *less* than 1, if we don't think *any* of the categories appear in an image.\n", + "- `nll_loss`, as we saw, returns the value of just one activation: the single activation corresponding with the single label for an item. This doesn't make sense when we have multiple labels.\n", "\n", - "On the other hand, the `binary_cross_entropy` function, which is just `mnist_loss` along with `log`, provides just what we need, thanks to the magic of PyTorch's elementwise operations. Each activation will be compared to each target for each column, so we don't have to do anything to make this function work for multiple colums." + "On the other hand, the `binary_cross_entropy` function, which is just `mnist_loss` along with `log`, provides just what we need, thanks to the magic of PyTorch's elementwise operations. Each activation will be compared to each target for each column, so we don't have to do anything to make this function work for multiple columns." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "> j: One of the things I really like about working with libraries like PyTorch, with broadcasting and elementwise operations, is that quite frequently I find I can write code that works equally well for a single item, or a batch of items, without changes. `binary_cross_entropy` is a great example of this. By using these operations, we don't have to write loops ourselves, and can rely on PyTorch to do the looping we need as appropriate for the rank of the tensors we're working with." + "> j: One of the things I really like about working with libraries like PyTorch, with broadcasting and elementwise operations, is that quite frequently I find I can write code that works equally well for a single item or a batch of items, without changes. `binary_cross_entropy` is a great example of this. 
By using these operations, we don't have to write loops ourselves, and can rely on PyTorch to do the looping we need as appropriate for the rank of the tensors we're working with." ] }, { @@ -1271,11 +992,11 @@ "source": [ "PyTorch already provides this function for us. In fact, it provides a number of versions, with rather confusing names!\n", "\n", - "`F.binary_cross_entropy`, and it's module equivalent `nn.BCELoss`, calculate cross entropy on a one-hot encoded target, but do not include the initial `sigmoid`. Normally for one-hot encoded targets you'll want `F.binary_cross_entropy_with_logits` (or `nn.BCEWithLogitsLoss`), which do both sigmoid and binary cross entropy in a single function, as in our example above.\n", + "`F.binary_cross_entropy` and its module equivalent `nn.BCELoss` calculate cross-entropy on a one-hot-encoded target, but do not include the initial `sigmoid`. Normally for one-hot-encoded targets you'll want `F.binary_cross_entropy_with_logits` (or `nn.BCEWithLogitsLoss`), which do both sigmoid and binary cross-entropy in a single function, as in the preceding example.\n", "\n", - "The equivalent for single-label datasets (like MNIST or Pets), where the target is encoded as a single integer, is `F.nll_loss` or `nn.NLLLoss` for the version without the initial softmax, and `F.cross_entropy` or `nn.CrossEntropyLoss` for the version with the initial softmax.\n", + "The equivalent for single-label datasets (like MNIST or the Pet dataset), where the target is encoded as a single integer, is `F.nll_loss` or `nn.NLLLoss` for the version without the initial softmax, and `F.cross_entropy` or `nn.CrossEntropyLoss` for the version with the initial softmax.\n", "\n", - "Since we have a one-hot encoded target, we will use `BCEWithLogitsLoss`." 
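A scalar sketch of what these functions compute may help (hand-rolled Python for a single activation/target pair, not the PyTorch implementation; the real versions run elementwise over whole tensors):

```python
import math

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

def binary_cross_entropy(activation, target):
    # Sigmoid squashes the raw activation to a probability, then we
    # take the negative log of the probability assigned to the
    # correct outcome -- `mnist_loss` plus `log`, as described earlier.
    p = sigmoid(activation)
    return -math.log(p) if target == 1 else -math.log(1 - p)

# A raw activation of 0 means "no idea": probability 0.5 either way.
print(round(binary_cross_entropy(0.0, 1), 4))  # 0.6931
```

This is the per-element quantity that `nn.BCEWithLogitsLoss` computes (before averaging over columns and batch items).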
+ "Since we have a one-hot-encoded target, we will use `BCEWithLogitsLoss`:" ] }, { @@ -1304,9 +1025,9 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We don't actually need to tell fastai to use this loss function (although we can if we want) since it will be automatically chosen for us. fastai knows that the `DataLoaders` have multiple category labels, so it will use `nn.BCEWithLogitsLoss` by default.\n", + "We don't actually need to tell fastai to use this loss function (although we can if we want) since it will be automatically chosen for us. fastai knows that the `DataLoaders` has multiple category labels, so it will use `nn.BCEWithLogitsLoss` by default.\n", "\n", - "One change compared to the last chapter is the metric we use: since we are in a multilabel problem, we can't use the accuracy function. Why is that? Well accuracy was comparing our outputs to our targets like so:\n", + "One change compared to the last chapter is the metric we use: because this is a multilabel problem, we can't use the accuracy function. Why is that? Well, accuracy was comparing our outputs to our targets like so:\n", "\n", "```python\n", "def accuracy(inp, targ, axis=-1):\n", @@ -1329,7 +1050,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "If we pass `accuracy_multi` directly as a metric, it will use the default value for `threshold`, which is 0.5. We might want to adjust that default and create a new version of `accuracy_multi` that has a different default. To help with this, there is a function in python called `partial`. It allows us to *bind* a function with some arguments or keyword arguments, making a new version of that function that, whenever it is called, always includes those arguments. For instance, here is a simple function taking two arguments:" + "If we pass `accuracy_multi` directly as a metric, it will use the default value for `threshold`, which is 0.5. 
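The thresholded comparison that `accuracy_multi` performs can be sketched in plain Python (illustrative only, over already-sigmoided predictions; the fastai version operates on tensors and can apply the sigmoid itself):

```python
def accuracy_multi(preds, targs, thresh=0.5):
    # Threshold each per-category probability independently, then
    # measure how often the resulting 0/1 decision matches the target.
    correct = total = 0
    for pred_row, targ_row in zip(preds, targs):
        for p, t in zip(pred_row, targ_row):
            correct += ((p > thresh) == bool(t))
            total += 1
    return correct / total

preds = [[0.9, 0.4, 0.2], [0.6, 0.7, 0.1]]
targs = [[1, 0, 0], [1, 1, 1]]
print(accuracy_multi(preds, targs))  # 5 correct out of 6
```
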
We might want to adjust that default and create a new version of `accuracy_multi` that has a different default. To help with this, there is a function in Python called `partial`. It allows us to *bind* a function with some arguments or keyword arguments, making a new version of that function that, whenever it is called, always includes those arguments. For instance, here is a simple function taking two arguments:" ] }, { @@ -1500,7 +1221,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Picking a threshold is important. If you pick a threshold that's too low, you'll often be failing to select correctly labelled objects. We can see this by changing our metric, and then calling `validate`, which returns the validation loss and metrics:" + "Picking a threshold is important. If you pick a threshold that's too low, you'll often be failing to select correctly labeled objects. We can see this by changing our metric, and then calling `validate`, which returns the validation loss and metrics:" ] }, { @@ -1538,7 +1259,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "If you pick a threshold that's too high, you'll often be selecting correctly labelled objects:" + "If you pick a threshold that's too high, you'll only be selecting the objects for which your model is very confident:" ] }, { @@ -1603,7 +1324,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "...and then we can call the metric directly. Note that by default `get_preds` applies the output activation function (sigmoid, in this case) for us, so we'll need to tell `accuracy_multi` to not apply it:" + "Then we can call the metric directly. 
Note that by default `get_preds` applies the output activation function (sigmoid, in this case) for us, so we'll need to tell `accuracy_multi` to not apply it:" ] }, { @@ -1661,9 +1382,9 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "In this case, we're using the validation set to pick a hyperparameter (the threshold), which is the purpose of the validation set. But sometimes students have expressed their concern that we might be *overfitting* to the validation set, since we're trying lots of values to see which is the best. However, as you see in the plot, changing the threshold in this case results in a smooth curve, so we're clearly not picking some inappropriate outlier. This is a good example of where you have to be careful of the difference between theory (don't try lots of hyperparameter values or you might overfit the validation set) versus practice (if the relationship is smooth, then it's fine to do this).\n", + "In this case, we're using the validation set to pick a hyperparameter (the threshold), which is the purpose of the validation set. Sometimes students have expressed their concern that we might be *overfitting* to the validation set, since we're trying lots of values to see which is the best. However, as you see in the plot, changing the threshold in this case results in a smooth curve, so we're clearly not picking some inappropriate outlier. This is a good example of where you have to be careful of the difference between theory (don't try lots of hyperparameter values or you might overfit the validation set) versus practice (if the relationship is smooth, then it's fine to do this).\n", "\n", - "This concludes the part of this chapter dedicated to multi-label classification. Let's have a look at a regression problem now." + "This concludes the part of this chapter dedicated to multi-label classification. Next, we'll take a look at a regression problem." 
] }, { @@ -1679,25 +1400,25 @@ "source": [ "It's easy to think of deep learning models as being classified into domains, like *computer vision*, *NLP*, and so forth. And indeed, that's how fastai classifies its applications—largely because that's how most people are used to thinking of things.\n", "\n", - "But really, that's hiding a more interesting and deeper perspective. A model is defined by its independent and dependent variables, along with its loss function. That means that there's really a far wider array of models than just the simple domain based split. Perhaps we have an independent variable that's an image, and a dependent that's text (e.g. generating a caption from an image); or perhaps we have an independent variable that's text, and dependent that's an image (e.g. generating an image from a caption—which is actually possible for deep learning to do!); or perhaps we've got images, texts, and tabular data as independent variables, and we're trying to predict product purchases; …the possibilities really are endless.\n", + "But really, that's hiding a more interesting and deeper perspective. A model is defined by its independent and dependent variables, along with its loss function. That means that there's really a far wider array of models than just the simple domain-based split. Perhaps we have an independent variable that's an image, and a dependent that's text (e.g., generating a caption from an image); or perhaps we have an independent variable that's text and dependent that's an image (e.g., generating an image from a caption—which is actually possible for deep learning to do!); or perhaps we've got images, texts, and tabular data as independent variables, and we're trying to predict product purchases... 
the possibilities really are endless.\n", "\n", - "To be able to move beyond fixed applications, to crafting your own novel solutions to novel problems, it helps to really understand the data blocks API (and maybe also the mid-tier API, which we'll see later in the book). As an example, let's consider the problem of *image regression*. This refers to learning from a dataset where the independent variable is an image, and the dependent variable is one or more floats. Often we see people treat image regression as a whole separate application—but as you'll see here we can treat it as just another CNN on top of the data block API.\n", + "To be able to move beyond fixed applications, to crafting your own novel solutions to novel problems, it helps to really understand the data block API (and maybe also the mid-tier API, which we'll see later in the book). As an example, let's consider the problem of *image regression*. This refers to learning from a dataset where the independent variable is an image, and the dependent variable is one or more floats. Often we see people treat image regression as a whole separate application—but as you'll see here, we can treat it as just another CNN on top of the data block API.\n", "\n", - "We're going to jump straight to a somewhat tricky variant of image regression, because we know you're ready for it! We're going to do a *key point* model. A *key point* refers to a specific location represented in an image—in this case, we'll be looking for the center of the person's face in each image. That means we'll actually be predicting *two* values for each image: the row and column of the face center. " + "We're going to jump straight to a somewhat tricky variant of image regression, because we know you're ready for it! We're going to do a key point model. A *key point* refers to a specific location represented in an image—in this case, we'll use images of people and we'll be looking for the center of the person's face in each image. 
That means we'll actually be predicting *two* values for each image: the row and column of the face center. " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Assemble the data" + "### Assemble the Data" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "We will use the [Biwi Kinect Head Pose Dataset](https://icu.ee.ethz.ch/research/datsets.html) for this section. First thing first, let's begin by downloading the dataset as usual." + "We will use the [Biwi Kinect Head Pose dataset](https://icu.ee.ethz.ch/research/datsets.html) for this section. We'll begin by downloading the dataset as usual:" ] }, { @@ -1750,7 +1471,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "There are 24 directories numbered from 01 to 24 (they correspond to the different persons photographed) and a corresponding .obj file (we won't need them here). We'll take a look inside one of these directories:" + "There are 24 directories numbered from 01 to 24 (they correspond to the different people photographed), and a corresponding *.obj* file for each (we won't need them here). Let's take a look inside one of these directories:" ] }, { @@ -1777,7 +1498,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Inside the subdirectories, we have different frames, each of them come with an image (`\\_rgb.jpg`) and a pose file (`\\_pose.txt`). We can easily get all the image files recursively with `get_image_files`, then write a function that convert an image filename to its associated pose file." + "Inside the subdirectories, we have different frames, each of them come with an image (*\\_rgb.jpg*) and a pose file (*\\_pose.txt*). 
We can easily get all the image files recursively with `get_image_files`, then write a function that converts an image filename to its associated pose file:" ] }, { @@ -1806,7 +1527,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We can have a look at our first image:" + "Let's take a look at our first image:" ] }, { @@ -1855,7 +1576,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The Biwi dataset web site explains the format of the pose text file associated with each image, which shows the location of the center of the head. The details of this aren't important for our purposes, so we'll just show the function we use to extract the head center point:" + "The Biwi dataset website used to explain the format of the pose text file associated with each image, which shows the location of the center of the head. The details of this aren't important for our purposes, so we'll just show the function we use to extract the head center point:" ] }, { @@ -1905,9 +1626,9 @@ "source": [ "We can pass this function to `DataBlock` as `get_y`, since it is responsible for labeling each item. We'll resize the images to half their input size, just to speed up training a bit.\n", "\n", - "One important point to note is that we should not just use a random splitter. The reason for this is that the same person appears in multiple images in this dataset — but we want to ensure that our model can generalise to people that it hasn't seen yet. Each folder in the dataset contains the images for one person. Therefore, we can create a splitter function which returns true for just one person, resulting in a validation set containing just that person's images.\n", + "One important point to note is that we should not just use a random splitter. The reason for this is that the same person appears in multiple images in this dataset, but we want to ensure that our model can generalize to people that it hasn't seen yet. 
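A person-based split of this kind can be sketched with a simple predicate (the folder layout and the choice of person 13 are assumptions for this sketch; fastai's `FuncSplitter` wraps such a predicate into a splitter):

```python
from pathlib import Path

def is_valid_person(path, valid_person='13'):
    # True when the image belongs to the held-out person -- in the
    # Biwi layout the parent folder names the person. Wrapping this
    # predicate with fastai's FuncSplitter yields the actual split.
    return Path(path).parent.name == valid_person

files = ['biwi/01/frame_00003_rgb.jpg', 'biwi/13/frame_00004_rgb.jpg']
print([is_valid_person(f) for f in files])  # [False, True]
```
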
Each folder in the dataset contains the images for one person. Therefore, we can create a splitter function that returns true for just one person, resulting in a validation set containing just that person's images.\n", "\n", "The only other difference from the previous data block examples is that the second block is a `PointBlock`. This is necessary so that fastai knows that the labels represent coordinates; that way, it knows that when doing data augmentation, it should do the same augmentation to these coordinates as it does to the images:" ] }, { @@ -1930,14 +1651,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "> important: We're not aware of other libraries (except for fastai) that automatically and correctly apply data augmentation to coordinates. So if you're working with another library, you may need to disable data augmentation for these kinds of problems." + "> important: Points and Data Augmentation: We're not aware of other libraries (except for fastai) that automatically and correctly apply data augmentation to coordinates. So, if you're working with another library, you may need to disable data augmentation for these kinds of problems." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Before doing any modeling, we should look at our data to confirm it seems OK:" + "Before doing any modeling, we should look at our data to confirm it seems okay:" ] }, { @@ -1967,7 +1688,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "That's looking good! 
As well as looking at the batch visually, it's a good idea to also look at the underlying tensors (especially as a student, it will help clarify your understanding of what your model is really seeing)." + "That's looking good! As well as looking at the batch visually, it's a good idea to also look at the underlying tensors (especially as a student; it will help clarify your understanding of what your model is really seeing):" ] }, { @@ -1995,34 +1716,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Make sure that you understand *why* these are the shapes for our mini-batches.\n", - "\n", - "Tip: A colour picture is a rank-3 tensor. The first axis contains the channels: red, green, and blue:" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [ - { - "data": { - "image/png": "iVBORw0KGgoAAAANSUhEUgAAAfoAAACxCAYAAAAs/X9SAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8QZhcZAAAgAElEQVR4nOy9aYyl2Xnf9zvnvOvdb93aunrfe3qmZx8OV200tUCkvEiGlThOAtkxYATxt8RAAgSMkQ/5EORDAAcBEkQxYCtELDuOFFsiFVGkSJEUyeEMZ3p6eqmu7uquru3Wrbu/+3tOPpxbtxRAHMCCZA+IehqN6r7Lu5z3vff//P/P/3lKGGM4iZM4iZM4iZM4iR/PkP+uD+AkTuIkTuIkTuIk/uLiBOhP4iRO4iRO4iR+jOME6E/iJE7iJE7iJH6M4wToT+IkTuIkTuIkfozjBOhP4iRO4iRO4iR+jOME6E/iJE7iJE7iJH6Mw/mwJ4v/6m8a0+8jrl2DwQAaDXAcRBBSfvd7CEchfA88DzodcF3Ep38O83v/EnwfNjfBdaFet3/HYyhLWF6G/X1M7xBRr2GmU4TrwsWL9vksgzCEOIYognbbPh7H9ifYx3o9u02wx1Cp2P0kCRhjX+84sLCAvn0HWQns61wX8tw+d3R8SQLTKbTbiDPnMQ/vIzodzOYm4spVcBzMg/vg+4jzF6HZhr1tu43zV6DWhCxBnL2O2byDvPIyev0dePoYrr+IaC5ivv7bmGdbICUoBUVxfH6+D1tbIMTxMR0cwOoqorOI2du1x1wUoLV9Xa2GWDuDGQ9hY8Nu03Xtc0kCtZp9bVna9507hwhCzNZTzMEBolq1x3/qFDx6ZNev1QKtKe+toxpV+/6lJbtuaWqPt163+9rZQU9jvP/1K+Iv+D79M8dvbvyGyXVOw2ugTYkUCkcoDIZufABA6ISETogrHaRQnK+d59F4A4BJPrWvUQGu8kjLFCUkgQpIy5RxPkEKSakLlHRo+y1ynSOFQglJoUu0Kal7dXKdk5Yp2mhKo6k4IZN8wlGLqxACT3qEToVC5xSmQBuNEg6udOinA7TRSCEJlE+uczzlURqNJz0cqYiKGFc6VJwKWZnhSJdhNqDptQAYZSMMmrbfxpE
OaZkikKxWVilNSaACQqfKQbLPUrDCIDukG++zFC7jS5+t6VO68QGhE5CVGVIoDBpXutTcGnvRPo5UONLFGENSJtTdGp7yyMqMpEzwpD1mIQRKSBpeg0KXdOMuAIETAFDoHE96CCEodElpCppeE1/5DNIBcZmghMSTHkoq4iKer5XBMEiHhLNtSSHxlY82mkKXeMrFkx79dMAkn/C3rv3tj+w9DPBrX3rPxFnBjdUaO6OMVugQOJLAlXz93gGhZ7/KHSU4u1DBcwR/9cYq/+TdbTwleLA7xnMU9dBlseaxO0wQQnB2IeBJL2YYZTQrHqM4RwrBzdN1itKQloaqJ0lyTW+Ssdr0kULQnWRIITAGluouu6OMw3FKmpc0qx4V32Gh4pIUmqI0ZKXGkYLFmsv3N/qEnsJRklrgkJUaT0lKbTAGKr7DOMk50w65uhjwzvaU002PD3amXF+tMs1KupMcRwpeOFUhcCRbg4xWqLi5WMeVkrrv4DmSvUlCxXEYZTnr/Smnaj4Vx+H+4YR3t6c4SuBISVqU7A8TTi9UaFccHuxOkFLgOpLlRsBWL2KpEZCVGt+R5KUmLzRgP7eN0K5/nGl64wQAz1UErmIc55TaUA/tZyIrNEoJ6oFLVmi2DyMcJWhWPXb7MYGrWGmFeI7EcyS3n/RpVjwKbTi3WKXqKYZxwSTNWW0E5Nrw5GDK/mHEt/6Ln/hT7+MPBXomE4SUiGoNkyQWgMsSE0UUwxjvtAV3jLFAohQYDcMhVKuYokRIeQw8oxEEAUwmkGWIMICiQPj+MbBPJhZwRiPMs23E8pLd7nBo91OWFhiTxO5z9lpc1wKbMRYQwW5TKUgSTDFLEI7AXwh7Pp6HqDegs4jZfmZBOI1BCExR2O2evYR569sWjFstzGiAiCbw6qcQQRWxsIrJU3vRXR/O3cBMBlBpwLlLcLgHy2cRH/9LkEzh8T17LL19TBwjllYx/QO7NnEMCwt2v54Hw6Fd+1rNJiKua89Jyhnw5zYJAwvEQWDXSEr7frDrI4RdQ8exyVq1ap8PAvu479u1TFNwXaQ7SxrS1K5Tktj9BYF9rNUCKRHqoy0KFbpACkmoAqbFFEcocl2Q6YxxNqXqhnjSpdA5EkHghsRlRFbmlDOgFQhKo3GModQFSIekTDDGIGZ/SqNpuXWSIiHXOTW3xjSPyHVOxQnnIGeMQUlFxakSFdM5cGdlPgMjQTFLCJRUKOEghCDXBaUpEdjPca5zlHTIdY4jHKpuBWMMucxniURJru3nwFc+FSekl/RwpUPgBPN9nqudRyCoOjVyk5OWCUkRAbA+vEfVrVJ36/yg+wNe6rzE6eoZViun2I/38KRNfEqjaXoNDtPDGZiWVJ0KhbHgnJYpBrtWStivHCHseQgE0zyaJWGS0pSUurDbESWFKfCEhzYlSjgUpsTDoDE4QuFKF0e6ZDpDCYeKE9rkRQg85aGNQRtN4LhoY7+Y1ezcxUzQ9JT3b/We/LPEOLbXcpyWKGnXbpiWJIUmKzRV3wKkMTBJS9rKISlKuqOEZsWeX1FqAleRFJreOKVZ9ehHBYW24BOlBY4UhJ7DJC0ZRjmtqkd3nLPZnXBuqYqSgv1xhjaQFyWn2wFxZte1WXHZH5bkhaZSV0gBSV5SlIZ2xSXXhlFSUmiNNpI4KwhcNT+/auBy63SVwJF8sBcjBdzrxpTaEOUaz5VcX6zwrScjlLBJw/4kp+opfuHKElIIqoHDzihmZ5JgMKSl5t7hhKQw+ErwG2/t8O+9dopbyw1udOrsRTFZaRgkBadbPhfaPhuHKRXfYRhlnOtUcJUgLzX9aUo9dGkEit1hQakNSgpcJckKQ5wVgL23i1ITeoq80BhjWG2FFFozjnOkFASuohE47I1SlpsBSV7SDD2qvkOSlTQrLlFaUnEloefgKAlo8lKzNyootMZ3FOO0YLnuIYB69Uffxx8O9FKipzGq1oCDLqLRwAwGkGU4zdC+5oh9C2EBRpeIc+f
g1uvw5d+ygBPHVg0AywjL0oKK1sfPVSqWwbruHOxFu2XB5elTC+qt1jHjHQ4tm6/X0WmOrNctIz51yr7nKBno9Y7fJ6UFtCMmH0WI8xcgrGJ2to5VC7CvAcQLL8GWZbvi0z8HP/gm5Bl87GeQZ6/Z12oNZQHJFJNnkCWY/h6itYzpbsPOU0y1AfW23eaNVzG6hDxDNhYwkyHi9ncxR+x7OrU/lf0QiOVlAEy/b8/Nde1+XRczGh4zfLDvrVYtGE+nln1nmT0fOQPlJLHb1to+127b1zqOfWw4RHQW7P+VsttqNI73E4Z2XX0fcfHih95C/67DlS5pmeIpj7i0LOYIZFp+AyUVmc6pOhWUVCRljCuXaHgN1qpr3Dl8H7AfXm1KDIaszHClixCCwAnQpiR0mpSmoDQaX/lM8glKOjPmbROLXBe40iEvcgIVUuiSXBd40l5PKQRJkVD36iipcISDFJJhNiR0QnKdE6gAT3l40sJdUhQ0/AaudJnkE8bZhLpXQwmFkvb9Hb/DKBshheJ66zqPxhtkZcalxmWqbgNjNEkZMcyGjLIhAHER83SyxfMLN+klPX57/Y+Ji4TLzYskRcK5+nkAam6d0AkptU0slHDIdDZXLo6A/Yh16zxCCglGo7FKhhJyvrYW7PX8+skZmCelZUmFzoEQY4xVBUyBg0uoAib5hHKWOOW6IFQzZcDMvoARFKZECYkjFKUpcKXLQtD+i7wF/1yi4jvsD2OuL4Xc68a0QofdccYo0VR8BynFnCnWfAvmADdWa/zS9RX+h288YhRnDKKMlWZguVdWMhI5gas4u1hlFOU0Kx5LdZcnvRglBQejlGrgsNIKSbKSh/tT0lyz3AzQWiAFTLOSg1HCQs1HCFBSsNWLuLRcJS80nZpHnM+Si4qHEhYcmxULYFIK8kLz+rk67dDl+1tjDicphXa5uhRSGpACPn6+wR9tDgldyV9/bo1vbB2SFYafOtehFtgkpz/N2JkmPBkk1H1FqQ3vbk9Za/kcTHLubfT4F65kqREQOJJLHZ9W4HKhWWGcFaxWAko9sueV+gzignrgUA3sfXyuHVBqg5SCojSU2t7DaVHSqnpEaQkUaGOYJgWNikuUFgyijMW6T3eYEHgKV0mUhLzUaG1wlCTJSxZrHo8mGUqA70qe9mNW2yHGwOEkJclKOnWfg1GKo2zCsDvKqIcut842fuT98+FAv7CAPAIfz4OwYoHaGOT5s/N/I6UFlOHQMtaFJUR7BZPnFliPwKJahenUSvUAnocejpGdjgWwZtPutywRyyuYjYd220mCyQuEO7XbOpLvlQLPQ3batnQwGiGWV60UKoTdd1GAEKiFpj3WILDHMRjY56XC7O8cs14AbSwYpim89mnM//vbiDc/gzxzFf39r8PLn0CevYaZ9CGa2ETh8R3o92asfxE2H2J6+7D3DHP/PkKqGZt2Me0ONBYQzUUr97sefPxzEE+gtwPVBmzctccVRVCpYfq9YwAPQ3s9lLLHfXQNajUr5zebmM3NY4YehvZ8sgwznR5fDymPWXsc2zV0HPu+orDvAQvwcFxKyTL73qMywkc4XGlv8UIXOMKCZ4llnEdSvTGGwhQE0gJDaUrqbm3Ong3GssAZa9TGsn1HzOTpIsEog5KWWRe6pOJWcYRikk8RQpCVVh0wxuDOFIS0TJFCYjBz1i2EIFCBVQuEQBuNNoastMqAQMzPySYOLkoqoiIiK7O5PA2QlimxKekEHaZ5xNnaGSpODYHkdG2Nqtvgg/577EZ7rFZW+M7OW6SlVTquts/zbnedjcEW3WjE/We7rNVqvL13H2MMZxtLXGtfoupWcKXLSrjKSrhKprOZXF9nO3pGlEdERTxLVAoL9kYTOPYcNWYuwTtCIYUicAIcoWbrLdEYPOkRFTEGl6zM8JRNjjKd4StNYTSFsWUSy+ZdpFAkRYIrXZuEKJ80T8mMxlMeSkgMmqzM/23flv/GsVjzGEUZe5OcSVpyYzl
knJYIoTm3WCXOSpv8aEMjUPSigv0o4XInJMpKJklOmmsaoWAY5XTqAdMkZxBlhJ6iUfE4GFuw3h1lLNQ8PCUZJQVXlkLe3RoTeIrxMGec5NQCh9IY8tKQlxopBJ4jaVY9zrZDupOMl9dqvP1sYoVYbShKTaE17ZqPNgbPVbRCh+44Q2tbJnh3Z0qcaaQ8ktQNUVoSZyWvrzX4xmTA33njLO2qR91XXFmpUvUdvvn0gPv7CaeaHm9tDpFS8Gh7xNUzLZ4eTHi0N0Zrw2SSst2LeLg1xBj4YKlKLXBYqPmca/uUxnC2XuFCs0JSaNbqIT/Y7eNKwTQrqfuKXpTTrrh0xxntqkvVU+yNM7Z6Ee2ahxCwUPOp+A6rdY9xnNOqeEySglrgEmcF48SuoxLCqlNHKk1slQJXCaQQ1HwXJQXDOMNzJNlMwUnyknGSUw9c3FkpYX/8o+/jDwf60QjOnYPxEBGGMJ0cs/HtbQsMRWFl+FrdMtJ+z0rd3/qKBZEsswClFIShrct77lxKlosLx9Jxo2Efn04xvQP7niAApSg3t3HC3AJcr2dZv+vCdGqThcVFxLkLsLIGD+7B4iLs7dl9G2O3XRSICxcx+3uWwfq+fS7L7PlqbUEPEKfPwJkLiMXT8IVfRZ65DlIi/+rfxSRT0Brz/T+wQL9y2q7NoAdnLkBvD9PdRYz6EFYRN56DPMNsrFsgHQ4xSYpYWUZ86qcgqCKaiwjHg3M3bDIglYWZJLYKglIWyCcTe9xHcv7hoVUsytLW0osCM51iJlNEo2HXIc8xDx8isuz/z/6NscdTFHZdtT6W+6W0f80s6RHCJmK93nHJpFY7LpN8RCMtU9rBArkuqHt1W8NWVgrWs9p2ZrJZbdwy00Ln1L0Gm+PHlv3PpkTX3BqBCtiP9yl0OWeIcgZQxlhA8pWVoxOd4EiFQOA4FUbZyDJ45ZHpbF6jPjpOT7k03SaudBmkg7kyEDoBjnTnHoOKE5IUCcWMQWujyXUxSwpKSmOvbydYoOm1qLuN2c8WGs2V5tWZZD7iXz36Ko8Hh9xcXCPXOTuTIZ85e4vN4Q53ul0utQtuLV3k6sJpBsmYzeEu24MR37z3kCT+Q2r1kL/5+idpeI9Zq67SCTs4wqHltTldOcNB0rX1egzalHjSY1pE85r70XkdrcOc3QtJWmYIJDW3hic9kjJFCpsolNoCuiOs6pXrYn49KiqkNHq2PlYhsN4FF0/ZuvxRcnXkAfiox+E04/JKjTjXXOwEPO6n5KWh6SsejlIWG/5cfl+oOOTaEBclZ2oV/vuvPyT0HQJP4SjLplcaDu9s2npwLXDpzRj5JMmpBRa8hLAf/51hZpm3ECzUfQbTjCgtaNV8tgcp0yTHcySDaWZr/aHHrVNVtDEcTjOW6j6Hk5Rq4OIqSafuE2clV5dCdkYZhxMridd9yc7QIARobazM3fK4tBhwc7HGlU6Nf/BTNTxHUpSGV5ZbeI7kMMr452/t0j2IeO7SAsbAnYeHfOKFVbqjhO3tMVcvLbDcDPjk9UUOpwXvbhywvT1i60mPeBJTqVf47Kcvc2dnyq3TNYqZN2G54vPycgshBkzSkqzU5KWh4h2XLKdZScVTHIxs3b4oDa4jMQaivGQc5zhScnmpQlpqHuxO8B37faGUZDq2CsBS6JKXGiUF3UlOK3RwlPVBeEoiETzeG9Op+zQqLg+eDQk9RUU51BoBrvzR9/GHA70QiBu3YHN9xlo3bP14d9dK5Ds7sLpqwXRhCeEFkKVQa1mgznPLJodDxK0XMTvbFuSPGOWs5k2eW6YIsL9v1QHXRdy4CdMJ5uG6ZdlheFybLsu5DC+roTX3uS6i3ztmmdWqBaMss0lBkmAmYxiPMVmOEAIzGljQH4/t9oIAihzxU38ZUalDpYHonLK1cKMxe5tQb2P2HmPufwD9PjzbsseWpohbb8D92xDHFh8GA8TFK7CwCNMJ+t4DilGMjjOCwMd88C6i2cYYDV4AV55
Hnr2OWLuM1iWi1rYJxXh8rKAMBhaQazUr3+/v20Ss27XXJY7t60ajuQIifN+uWb9vH1tYsOsSRRAE6P4Q2Wohlk9hjsoiZrbmYF8n5XGZ5ciYd6TCfERDCkXbb7Ez3cVTLlE+pRN06CU9Wn6LUTYiUOHcYGekoTRWMcq0BRpHStIypRN06Kd9W6+fSf5KKGpelay0oFKYgnE6mbFzl9PV0/TTQw7iAwpT0nCrCGzN3R6ftCqDSYhnYFR1q3OzWuhUMMxMP0iSIiFQPlERz8xwJUmRUOqCwhQYDFWnghSSy41raFPiqxCMYZQPmBZTHgzuc6p6ip3pDt/b3mI8jtgejbm1ssyz8ZiqU2VjsM9qrUY3ivi9R++xWqvx0vIl7h/uWxUjyRgMJ1SqAd94ep/L7TZv792n5vl85szHaHhNqk6NiltlMVjiyWSTwpTkZUapC1JjcKSan1dUxHMWHwhr9DNGU5pibliUQthzRpDrnNCpgPTm/orSlDhC0fJa9NM+hS5njN9HYg19Bk3drc/vj8KUCPPRB/pSG26tVvjy3UNeOlNnd5Rxvh2w3o24uBTysBux2ghohQGn6wGhK0kKTSNw8V1FlBZIISi14exCwGYvwXMkriMZxzmuI1mo+XPAvrIY8M7WmFbVw3MEv/riKre7Y763OaQoNNXAqlmFtnXjo/+HnsPhJGXLkzjK7s9zBK2qRyNwGMQFNV8xiDJ2Rhn9aY7WhjgteNLPGCYFg1lS4LuSvDT89RfWUNIy3EIbnh5G9NOMO90pZ5sejw5T9vanjEYJ33t3h4sX2hSFrYdvbA25fKHNzv6Ep9sjHEdyeqXGcJjiug5FXlCWJWE14L31A1aWquz2IzqNgIuLFSaLBaFSXG3VCRzJd3f6jBIL3nFWks98D7XAoeI77PRjGhWX/WHClZUao6S0trFSM0gKtLFlwLzQjGJLMM90qiR5ySQpcJRgHOe0ax5vnm3w++t9hlFGkpUs1H2aNasMAJxdrtGp+SR5SZKXMPM7/Gnx4UBfqcDyWczv/BacPXtsdDuqE/u+ZZhBANtPrPv82SZkMwZ47hw8eoSJE9jfs+xTSgsejoO4dAWSGLP52ALIkVxerSJ++vNW/n/vW/DgPk6rYvdZFLC0hO71kbUZ4zSG/Mke7tISTMd237u7iIuXMHmGaLagUoPxELP11CoQ9Zo9BteFqzfsuexsweufQbSWLZOXM9NbNEbvPAQhYeuhZdnLpxCdDtG33kUnj6h+9jWKB48RD/8ROsrwfu5nMJuPrFv/zns26ShL8oMxwlEEl1Ytq9/dxaSpPZabL8Po0CoGkwHy/E1MlsDF64jmgi0DjMfQ7yNWV+37PA8RW/Pg3HDneRTjBLdRt0lPv2/X7k/W6BcW7P9nJQu5uIB44WXMB+/ZJOJPyvOz2r3eP7AKzGhkEzyl7Pp/hKPpW8BJygSnsPVeKwEbBHLuGtcIpvmUtr/AOB+hhMJXPnXXYz/uUuiccTYmKRK00YQqIHRCqk6VwhTkuqDiVsjKnLiIqLoVnms+h5oxzp3pLgJb50coPGVr6o5QCCkwxjAtIqpuzRoAhWCST1jwFyhMgSOsk9xTLsOZc95XPmC/OBbDRTzpsRfvc6V5DUe6KOHgSZ9cZwyzPm8fvE2uczYGT/nGs+/z2srz3Fpe5kvv/zFxmuG9qni6tc/fv/0/E/ge/+AX/wrffnaXtCy5s7fP+7t7tKsVkjil0ajyxo1LHEQRcZ6zO5ngSsnrp26wPd3hQv0CTydPOF+7gCNdOsEipSkZ5yOMMQzSAcGshg5Q6gJHKVu/xyZZg2zESriEQBCVMUpYVqkxGI4Thbn5TrqcqZ3l6eTJvOwy3/ZMOZgWERUnRAnHrr0Q83X8KEctcNFAlBZsHqYoIRinJVobLrR9HnYjDqcZKw2fO90pL67UWD+MOIxTGhWXBi7dYcIoKuhNC/LSMs+FmjV7vn6uzjgtubMz5cZqlSeDlGJWf/6PXz2
LMYZRlvOtoo/vK0JPkRWas+2QHwwTtElpVz2MMTw7mHK6HTJJNcbARjfipTN1plnJWqNCp+JQ8xT39qys36i4OFLiSsGtU1WWqi3ud2P+2s1VSm3mprdxUrA1jPiXd7oIAQejhC/3Y968tkirFbBxd4tiOmE8XMFow2/efgTA+c+/yupSlYN+zM72iI37O5w6u8je1j7nr6zx3M1VHj3qk+clkyijVQ9oVz2SXNNPMva1ZrUSIoXgxaUmy1WPnXHK1jBjb5hyrm2d75NE4bv2/pVCMEqsyjSOMlo1j7w09MapVSwwdOoB4zinEShcJdjsTqkFLq2qx688v8L/c//ACqpZiaMkWW7XU2szNwZ2RwmrrZCmryg+5Lv4Qy3T4rOfh8H+sTO70bBsXkoLBEfMfCYXIxWsrGH++A+sI/v0echzxNKiBYyimAOzuHwNlMI82bSgIaVlqouLyM//KnzwNuaD7yHOXbMO+LywCUJRwHhsW+V8H6TEpBnuom35MpOxVQnW1uDcJcTCEqQJ5tkT66p3XXs+Z85YRuwFUG8iP/4L8MnPIS+/gmguWUPdsAujA/Q3fxtGh/D4LhQ55v4d2HpsmbTnUKQ5yR/fphhMyfdHZPsjzOMNxBufJPvhPcRLrxJ99S2K9U10kuN2auT7A4pRbIF4fx/x6iesRJ9nmHtvYcZ9zLiPcD3k1Vdh+bRNVmateUZrC8BHxrqytNdn1sHgrrQtYI9G9mK67nHr3RGbP3LdzxIA+dpnj9WVSgXxt/4eACaxiYi8cA5WVuz7u13Y2ED4wY+6fT4Scal+iVE+xJu1e1XcysyYNjvnGbvWpmSSTyhMTuiErA/XZyxSoo2m5bcpjSabOdk1hqpjmfc4mxCqgFLbbSyFS1xpXOXhaJ1xPqLm1Cxzl87MD1CSlTmutHXkrMwojDXqOUKRlfncUe4rH4FVFOIiISmSeS274lYJHJtweNLjdPUclxuXqThVHOEwyYc8HN1jfXSPX7/zG2Rlxt2ebRvcGY/59va73Ol2qddCOq06H9zdZDSKcF2H7uGQf/z2N3l55SJvfbDBL998jfdub/Dg8Tb7+32ajSrvPd5i4/EOg0nE+v4Bnz77AsN0wv70kK88+X12prs8Htsv26bXIlQh2ug5MMOx+96RLqXRVl2RDoEKaXkNpFBzA52vfAIVoISk5bfmLn4l5CwJUCwGS3MzX2kKXuy8OPc6hE6Ftt+i7tapuVVyXTDJp/Nk7KMcf/u1M6wfJDhSEGcF7YrDk55VQR/3U8rSUJTWnT7NrCK11vD5P35oE8yDkQXutYUKaWElZt+1ic615QqlNnz7YZ8z7YBRUrLdjzm3EPKffeI8/90frPPgYMKlVo1q4CKFZddJXrI3Sgk9K0O7jqTQhoWGT+BKorykU/e4vlLlXNOn6lnvwHu7EZuHMY3QI/QUN0/VWWn6rDZcrrarfOxMh5+/vITn2MRuf5zyva1Dvr7Z5R9987F1/g8TCm0wxvDVt7cZDhNc32Xt+kWGT7eIxhFu4MOwy1f/4AOGk4x772/xxstr6FKz8cFTPN8jy0ref2+bfnfAdJqxuzvBdxVPDqbcfjrgSz/Y4RuPRjweT8lnbYCXWzW0ObJGCQpj0Ma67oUQaG1d9lIImqFDJXCJU9sBUWqrenhKUpSaM50Kh9OC0JVUAmuq9BzJxRnLH8UZgaf4L3/mCoXWFKXGdSRXVutcX6mx2gp5vD/h9rMRNe9H38fqi1/84o++u6LBF1l/H1GtWOnacSwYB4GVwbsHiIWFY/C++YqtWW9uQKOBiKfHju3JBBNFCMeBtTVEowmRNYaJdscy/moV+Yt/A/O9r2Fuv4tYXAIlYNBDHBnDjuT16dSay4YjTKkR9RpicdECTxzZ+nwyRejSJiFHpYFZ/7yoVhEvvQHnryFf/klAINorFgjTGMoc8/53ME/uQ60B+zuYJ9skVSoAACAASURBVBtwsG/PaX8X8ZnP4oSC8M2XyO+
sc/B4wPAgYmtrTHM6wnUznBefxzx5THz3CU8/6LL6sYvE93bxVhpED/YoeyPci2vQP4A770GvC6MBorOMCKoQTxH1tjXo7W8hHAWOQlx7Hp48PpbpK5Xj0kYUWaXlKI6MenluE4tm87jVTinbffCzX0B/+TdtDb5SQZw7D/tb0DtAeB40GpQPNtBbu8h2A72zh6iEEAbIT/zCf/Nn/wr7i439ZOeLw3RI3atTYmu7ts3OtsjFRTzv2/aVT9tfYFpMmRYWAJIymTPPpIwpTYkSik6wQOAE5DrHlQ6+4zPNp9S9OufrF7g/vM8kH1tZX2cWzHWBIx0cqVBCkZuCXOfzEoCvPKpuFSWVTTp0gcagZswTrKnQYMsLoQroBB06wSKrlTMYNBVVpTQle/EOg6zPV558jfd792n5dQ7iPu/uP6MbjdHG8PDwkP/8zV8mqBd8+tpz3H62xdbuAdM4Ie5HTJKEp1GfN65eYGPQZTiJefB4m/NnV9h61uX0mSX29wc4SnFupcOTUZe3dp6w3uuiyTlVW8B3fIbZgAW/Q+hUyLWVKw2Gtt9mnE1IZmY8IZgxcXt+R2Y8ZgzeVwGZzjEwnwFw1HJY6JLrreu823uX0tiWypbf4jA5pNAlgeNj0PO+eSltmcAYQ6B8LtSvfGTvYYDH3eiL6/2Iy0sVRqnGcyS9iXXEj5OSrYMpq23bP+9IweunWiRFyYNejOdINnZGnO5UaVc99gcxSV7iOYqry1UqniTXBqUUZ5o+jw8Tqr7Dr712hn/4ew8sc1/w8ZRke5ISZyUIQcVX+I4iSq2UfTBMUFLSqLicagY0fMXTfsreKCWdGcyiTDNKSrSBrCgBC4aLVZefOLfItZU6xhjqoUOUljzsT9iaxPzOnQPu701xlGSSFGztjen2IsrSMJlkvPzcMsurTRbaIZH2GT99jI4mYDQlip3tAWvnV0hyTbtTp7uxSV6UxHHGmfPL5LmmyAtWVpvsH0RM45xn2yOMhIVGgDGCbpxythHOlQvPg0LDJ862eHd3QpQWloHnJSutACkEvUk2e9wQePZz3Kx4c1OdoySlMbhS4ijJNC34O2+c5X/8o00G0wzPlbx6rsn64ZT9iQX95brP248OeXIYIaVgOM2oBi6OI/nZ64t/6n38odK9Gexj7n+AuHDJAsNohIkTxFEftxCWMTabiJVTtp1s/5ll3WmKOTyExUVEpYrpdm3v9uoqIqxANMHE0axentnXvfxx9D/7dZsYLC3BaACeb9lnp2MBamfHMtGZiUzUa4ilJWuOy3MLVFojrlzFdPdtYgAW/MIQlpcRzbY12l26hfArgLDM3hiYHGImA8zD27C3jYkmFkxnDvXyzj1Mrhne22Xxp38eUanCS5+gojWD//Y3GGUFS3WfB/cOOTP5LpVmSHBxkcYXPkVjOOT2P/0O1z5+FvXaqzQuX6S8t269ENUa0x9uULl1AXHjBcvsD3fACxDNDqLSRPz0L2NGPfi9f2YTqtJ+4Gi1MLt7FpCVgn6foj9F1QpEo267HI5aDms1a8xrt4+d9XEM1bpNYCoV2zb3+k+iv/Iv7NqdOQPb26hmbT7MRzZqdt9x/Of4dfbnH/30kKSMUaJKXES2b33W925b5OR8wE1D+VScCmmZzNu8sjKn7tXIdYGcgXHNtQw9LdP5YJvClCwEC1ScCu8cvEPVrRDMhsoETohBsxQukuucqLAytEDMlYXQCWds3rJ7T9r9REVMaQoKXRIXka3zew3qqs5SuEzb65CUEUkZ4cmA0pT00i4HSZfv7b7DBwfbFFqzN91kpVqlF8f0Doa02nW+9Z3b7L70E0zzjJ+/+AripwX/9T/9P0nHCaLiMBhNWN/YZn1jm5vPXeD5a+f4pTdf4n/6v3+fT7z5PM8vLaGk5MnmLkoIKo7Ds6f7XLt0mnONRTaH26Rlxmpl2Q7gCVc4V7tEWiasj+wsiUxnc2PcIBviSBdHKKaFdeuLmXciKxPKWclECME
gHVJ1KwgkmU5Jy5RUp/N+eykkS+EyT8ZPAGukHGYjfOXj/AlDIzBv3fsox53DEf2oYKXmMphmbHYneLP6uqMkSgkmaY7B4dpSSKFtDznAKMq5uNqww2lmju1OPWCx5lH1Jfe7MZ2qy3Y/RhtYa3q8vFrnH/7efVZaIa4S9KKCqptRGlhsBCzXXTZ7CbnQZEVJq+pTD13SWQIBcHc/wlWSK0shW4OMuq/QxjBJc3xHcWEx5GonZDH0Wa0FHETpfEZAWmgeDSasH8bc2Znw/noPz1MYA0oJBoNk9tVX44Pv3GZlpUaSFLxybRHv1ip/NJoy7XbBcSGNMXnGk9sPWDh/htVTLa69+SL3f/iQs5dWKYqSJEpIooRirU2j4fH+D59w+vwyrUbA+vYIIQStzOFdf8BzusHNlQbXdJ1pusv2NGZvYNsRF+o+e4OYrLAye2+cMJqkhKFLPXRJ8nLe3bDS8Pnu+gHnl+vzcx7HOVmpqfoOWhs8V/LxU23++d09AK6v1Li/P6VV86kGDhc6FbrDhNBTTNLyR94/H8rozcO3v8jmQzsEZ1bnFc5solsQzFvkbDvcKuLMJSvbH7W3HcnLo5EFmIMDRKcDaYIIqzAaztvgxI0XMd/5uk0IHMeyzIWO3Y/vIVzHGvW0tmz1qCZdFHb7M1lfrKzO+sNn7vlqDfqHx1PjXBf1K38PsbBin/cCzOGu/dl9CmWB2XoAW4/Rb30fef0m4sU37GCYfg/14kuomzfZ+/J38D94F+fiaUSWgOOw8rk3WDzYokwK7vUjblxbQijJ3vt7VN2Ssjdid+OQU88tI+oVdr/0NfqbfVo/+Qr0+7jLLcQrr0Nv3zr4kxiGhxg/gHiMqLURYRUTBHCwCw8f2PWNIsuug8AqLIMBGI1s1MHzKA+GSE9ZwN7ZgSy3a3rugl0bgO6slc/3ET/zeXjrG7C3i+kPELWqde6PJ3a2ge/bfTgOlCXys7/8kWVD29HTL0ZFRFqmZDqf924D+Mqj0MV8Il3VrdLy2myOH8+m0pW4ykEgZu8yxEUyc4An855xPZOca26VnWiHuIhnQ25yy9DFcYVskk/nxjmNmU9x00bjqwCDoeZWZ8BukyglFYWetYkKhRSCVxbfmA+n8VXIo/E6Na/O+uge42zM+uAh+1GPx4M+r506z39w8+e52FpGyJg3L17ltbVL/OsfvMXtvSdUGhUG6T5RnvBLr3+MoV+gC814a8DHPv0CwsD99S1c32Vj/4Bnj/dYPb1IqxLyW7//XQ72+7z03CXGWca5lQ7PLa4QFTGPBocYk7EX9WgGNZ5NnnKqsjbrTPDpZ33GmS0txUWMI128mTP+aAJhxbVmxKRMCZRvOyfyaO7i7wQLpLMkINfWYOtKjzO1s3TjLnEZExcRSipKXc5a+3xCJySbDfJxhMPV5o2P7D0M8Lv3Dr64M0wJXMXeKKUReriuJM1KGhULsI3QpSgN59o+K5WAbz4ZIATsDWNaNTtZzZECR0n2hzHnFis8G2QEM9ObNtAMXV48VeNbT0b0xilSSiZpwUrDQ0lohw4aeDZI0QYGUWZLAEBe6PmUu9zAhYWAcVqy2YsIPYdO1eHJYYLrSAJXEWWav/HiaWq+S5yVhK7i2TgmUIqHgwmTvOCH21N8VzHJSi6uNbh1vs355RpLnQq4kosrDW5//y7DROAHHloKDgcJZ893qC60GfQjdBpz5fUXKIXLZDBhPIoZDyPywy5Ovcnyco2ndzcxvS1ka4myhIuXlzi7WkdKwfbuBCPhcJJxph0SlTlN16VZcVkJfbYnMVuDFEdJorTAdSRKCpbrHr1JxjjKaNdtu93BOJ0lP1Ue7E9s54gS3FqrsT1IMcBhWjLJSiqewxduLPGNp4ds9xN2+jELNZ9CG6K0oF3zCRzJ3jBFzsyKf+XWyr85o+fRfcv4jkxcRy1WcWxb5QYDC/btNoQVTDSyw20WFhCdJczWEwvOS0tweGgBPI4ti1fKMtmlZds
n/q2vQRBQDmOcysxc1u5gHj+0AN3rHU978zwL9lLaWnwQIFZPYz64jdndsYN31tYs4z1qrcsyxLWbts6dRuBXENKazERYxexsgOthnj7E/K41H4ozp6HegNvfh7Vzdg2iCebxQy78wi27/8kEs/cO6QeP8a+fwzvV4vFb25wPffJhjApcmk0PpxGSPjsk1oYHX9/gSmnY78asnaqw9+v/mubpJsGnX7F1emPg+ouw8wROn0f4IVTqdkaB6yPPXkc7Ljy8b49pfx8uXICdHeuYz3PrYVhbg+EQVfPhwgWEUhjfR/7Mz2He/q4tx6Qp8j/8T9Ff+l8Qn/tF2H4KXoDJZka/Rt16JHqHdmrFzIBXDieoTms+1OejGr24R1KmM7lczsbFunNXu21Zc+f963ERMcxGuNKl7beIy4RM21GyWZnZQTSznvpC5zOG6OAql6eTLcvidY4oJd5sqM4wG6GNBW5PemQw76MPVUAjaCCQuNJhnI/pJT1r7nNCXOnOavnWcb8QLhCogGk+nvsH4jyi5bd55+At6m6djdEG/9sPvs5aq8Ekywgdn9+8/2XO1BdZP+zRCqbc2d3nr33uE6xUq7hK8bX1hxweDFnoNCnLku7hCNUJGfTHTKYxrutwfnWRd+8+BuA737vD+PqUIiuot2p86Xe+wYXTK7x56yoAxhh+9bmf5qtPvssrKzfRpmS5skwv7dJwm1TdBqcrgmE6RArFKBtRmQ0FGqRD0jLFkQ41t8okn+Irn6VwyU70c3JOV0/zcLROUtpJhC8vvsw7B+9wo32DdMbQj5IFX/m40mWc2XKWHUtsiIqYulefqyof5Xjn6cjWqyd2nKoQUPftLIFpUuA5ing2TMV3JEoIbm8N6dQDfvJahzt7EXFWUPN9xnGO7ypGSckozmlXQrLCcGutRt1X/Kv3DzjVCpgkBa6yzvyLrZA/2LATONPCtoA5UmJCl2FkpePlhk8jUFxcCPjGwyG/++4uzarHpZU607Sg1HB2IWAQl7x4qkLVVRyMLYu3znQ7r+IPn/QojeH29pQf3u9aeXqY0K777AwTeqOER5sDOp2Q3/7BHT75S59hpRWy3PD5J//XD9Fa44c+1bqdFkmesHH3KVJJiumEGy8+z+1vvw+6pHv3Lt3tDkRDqC2w/84PqF2+ge87xHHB6lKVX3jzLN9dP+DyaoPuNCdwJZujCGOgFji8sbrA/X17zz3qTgk8RW+ccjBK2D+MqFTsGowTOwnw6modJQW+o/j7n7zI//7OM8azWQF/981z/PpbW/wnb5zlyTjCnWGv50pW2yFCwE4/wncVjrTlrijJaVTcD71/PvQON73usYlOCHR31kMdhrNRqxoTzcbWnrlo64izATeksa2jG2Pd9kdT18bj43qxUnDxGmZzY27UkxXP/tt1j+fbj0ZQryMuXYU0JX+waQ+w2bQqQ1iBevN4Yl6rdTz7vtFCvPIm8gv/PvIzfxl561MQVO0kO2HrgUgHU2SYO98DoxGXLtnhO+cv2vJBuwNPN2wL32SM2XpmvQWvvjFrLVzAv3kRFhdRz13j5ReWSQvNBw8HCCU5PEzIuiN6Gz00hkfjmGdvPWUrTslyze5exHhnZFvyHm+AH8Kob/vz776L2VrHbN7FDLqY4QEoB7l2BfGXfskaIdMMnj2b97vrND8e/zudYrLCjtk9PLTmyOc/bhOUx4+g2cT80e/OZgw8b2cgfPW37PU5mkLY79sSiZRW3h8OUdXArn/5o+Wij0LkOqfQOVmZY7Cz14/Gox6NiE3LjKy00+pyY2d9g60MJ0Uy78kWQmAw81awI9OXHXc7xZEKbUoc6cwG21gDYFxEdriN8vCUndTXTwdUnQpVtzYH+YpTmbNOa8IT8xntbb/N5eYVztUushqu4Ssr0xcmp9AF0Wwm/w8PblN1KvzijRc432zyyzdep5+MWAhrDFL7eyGUlCRxSi+OudE5z9ZoxCtnT3P98hlurq1wfW2Fc2tLlFn
Be+9vUAl9xtOIb//gLr3BrKtle8rDxzswzmg36zjKoT+aoI3hvf0dqm7I3cOHfO78J/nuzns8HW/zzWd/zNZki83JI7QuqHstXuy8PF/bXOcYY+vxR4mXO3PLH63J0e8LaPsLGGMYZxNc6fJ08tSOQHWq+CpgffhgNqfAmffRG+xgpNLYSYVHEwaPBgx9lCNKC4SwDNpRgu1+RG+c4juK3X5EqTWHk5QoLegElr0raafOTTPNwSghzUoOJxkLdR8BDKYZp5oBSWHr55faIV9b7wN2jK7nWHhwlWSa25YuY6DmK55ftWaxrd6U0HNYrFuQbwQOSoh5n30j9GZDfByWaw4vrtT4wvVFPnV2keeXmjRC21ufac0wzdmeRuxPch71UrQ2rCxVeenqIp969TSlNjzdn+AoydJSBaUklXqF3d0xoyjjD3+4w9KpNpevr3H67AJh6KIcBfUOuigoixJ6T7n9h2+B60G1CekURj1QDn5nEcqcSffQjpPO7RCi97eGvHapwwdbA9b3pvzhgz639yY8HUdoY1ise/xHr5xmOmu3O1qzWugShi6r7ZC6L5km9vvmdNNjZ5RRaE2z4jKcZtzdnbLaCvj21sC263kO5+tV/vFbW2SFoeo79vpNc9o1W44utGZvlBH6Du3qh3eOfHgqe8SiZ/3UcnHBgndumaBoNizjA8TSGczo0IK5lHZO/NEvYYHjWeq1mv2Z54grNzDvvWXBQwhEs2l74meDXszO1qz+P/vFLt1du309m3l/NNteKpjM+swXF62icOMW4vmXER//WeS11xFL5+zrgoodgOK4mOEB+ul99Ht/BNMRdFZse+D1F20NGyzLv/MevPC6NRjWG8jnbyKev4V57x1IEqKvfNsmO3t76PvryNDl5vOLrC1WMKXmbnfK9g+3ydIST9jsdTLN+YkXlilyza1ffI7mFeszIAgwd+9gHj2wLYuOC5MR5vZbtl6/sAoIcFzkpZcQP/mziOvX7JrO5tjLtdlsgzjGTKbIpY5duzxH3HwBBl17PcoS+Su/RvH9txGf/xXM1gPE0mkr7zuOVW0O+1Zp6Q2s0382TthoDb2eHYn8EQ47EtW2bUkhqbv1+WAZRzg40p2Z5BThDGilsMBbaGuWs0No/j/q3jTIrvS87/u9Zz/n7rf3BY0GMMAMBrMPh8N1OBRJSdRCUZEsyYkTR0pSVlWqnMpS5YorlUzsVOx8SOJyPiSxHcuOEzuRRIW0JJoiRYocLsOZAWbBvjaA3m/ffTv7OW8+vKcvrCqRKeuDa3KruoAGum9f3Htwn/d5nv//988RaHiGWyjdPSy9CHIJWmqfjMAzvJla/jjIxtZtSqaHIVTwyrGa/1hNrhd0PD/1EUIU3nqdqlWdCQSb9hxls4qpWeiaUaB6Te4M7/BO5x2+d/BGsc/PuNS6xrnGKZpujSRP2Kyt8ebeNp9YewlT16nbDj9x/iy/+Pjz/P6dd6nZNt+7cZf9dp+d0Yh3726TZhkXLpxibq5Gmmbk45jxNGA0mYKhQcMmThJOPbXBaDLloy9f4PTmKp5psuB5fP32Td45fMg3t99QVqNowjfuX8fWbU5XHkNCoYQvcbZ+jrqleAzHoUPHu3gFBsqoWhWiLCLMIipWWXnnhTpYPTv/PHvTPR6rncVPfZziezUUIe+YSAjKZeEaLkmmRIHjZIyffrB1JsAMtepaBpWiOGa5pDeJODFfnsFo0kyyWS+xM5kWsBVJkucz8pqU4Fk6nqPup2Rr6AI+vF7ldy630BAIASeb9uxnrdYVoKfq6JyaU+Pi9jQhzXOyTHHyq45Oliu4ztVDn2mUzuhwFVvZx05Vy5xplNlslnAtFbAzjVLCJOPd1pD3j0a8+VBNXSZhwl5vymMrVXRdwzJ1Pv5Yk+k05iNnmjiOyfpCmcfPLXDh7Dz7LTUGHw197t7cYzyOaLcGxNMppXoVu+RiuzZ4NbRqUzWihgm1ZbBLlNY2iAZDHv/sp1k8tY7
rGliWzp2tHjv7Iy5tdUlTxQy4uzOg5uislFyyXBImqmD/+vNrfPyx5oxHZmgap5erVByT7jRVh7CKzd4wJkozXjxZI8slC1WHTEp+88Mb/OBul58/P4+fZDiGRpLmOIayHo4DdRA+Gqr3kKpjqJ8fZ9zcHdCfxj/y+vnxo/tjEptpquJ7dKQKyLG/GlSRXlxElOrIa5cecdiPjuD06UckteFQcdGlVOr7U48pmtxkonzzq+uwXOyQSyXE2SfVnl0I2HuInCQzat2Ms1+Ac+SghzAtxa1fOwm1OUR1DtFYVMXddsEoTjx5CtEE2dmD2jwMclXco0B1s7vbSlkfBIgTp9Be/Anki6+qMfdnfk7dT5YgyjXk3g7iw5/AA7pfeh3DEFg1F73iYs5VKE0iSj/5Ms6Vf87mv/tZxPwip05sqp83nYDtUHn6JeTbr2O7niL22bYSHkqJ7HWUz79cQTz3snrsYaGmd5TVTnv2VeTKafKv/GM1LQlDtSopwntEpYw4cUKN430f8ZHPqQ7e8yCOyX/3H6LPNxDLm8jeAfnv/SPFTADodBAL84iFBfRKRU1mAFxXrWEMAzn9YL9JuoZDVrzRO4bDOB7j6PafoqEdB6NYmk0/6s3+fBgPWXQXmCSTWXhL02moBKo8pmyWlTq8wNMuekvFiHiMrdssukuM4iGpls5Y87pQqntLsxAIgizE1AyEVAXfM0pIcuadBabphIalOldXL2FqilqofPED7g7vcKp6ilE8Ymuwz43ONk8vnuZoOuHvv/d15jyPD6+c4SfWPs2nVl/B0iyaLzTwCqCOIUz2Jx1eXX+Ziv1dvnrxCnfu7WKaBgvNGuWyR3kS8G9/6uP8zb3f46//yhdZLs2zUTnB3nSPbjCk4VR5ceF5vrX7Oo5hc73zgIZTwtJ1TF1nGIbMeR4LXpPffP7zxHlMKzjA1CzmnHls3WXeWaJi1nivcwk/DdCEoGJWyAoevqmZ1IqDgJSSk+VNHk4eYOs2QepztXeZqlWlbJYZJ2Nud29RsxVhsBf2KZslTM3EM1yFOtZdAqlcOLrQCPPoX+MV+ee7LVSdGSEtzZXQq15S3bImIEsllqEU73XP5F4vxDZ08lxyeWfEueUK99uq+B8OAp47USWXiur25JJLN1AgorWGw7kFh5WSy7X9CUtVm4+cqBKkGff7IbeOAlZrFt1pSpZJqiULTajUtmmczcbwdc/iyZUSqxULS9NYKbm4pk6jZM0mBd1JzMORzw8eDvnIRpXrR1MmkQrQWaq71Es2793r0Kg6nF2p8rPnlvjCE8sYmuAj6zXmPVsJETXB//LWNq+eqvO1210u3+ty5dIWeRzRWFnELTn0jvr8xS8+z//5pZz/+jc/zlLZYqNSYhDF7E9CypbOk/M1vrvTwdAE7+1NcS2dzigiR+KHaid+sunw0xfmEQJ2Jz4LmU3Ts9A1wUrD5RdLK7QnCfuDAD9KeaxeYhRmmLqg5Jg8s17BENCdxnzqxBzfediZJej9ne8/UD+jVuJwEvJ/X+3wxGqFHMnBMGS16XFmwWO17rDbV1bLkm1Q8ZQ2I0p+9HT1xxf6MHyESD06Un73PH+0t/c89XW2/SjUprBrkaaIk5tKODYeK+W3W4LJSI3V/QlyRyliMU1YWkO+80M1KTh7Xo3idx8gL72lvmZ9fRbQIppNBcE59o835mBFdezaySfU10sJhgVZApoBSAXyyXPy1kPwxzDqIVY2kafPQftQceefeg7KNcT6WcWiN21lc9NNRHMFbG8mBNT+wn+AqDSQG0+w8FO/xOQ/+2sIy8D+uZ9C9jvM/yd/HRlO+Ylf/vcR9SXkoIWoL8FLehGyo6uP8x9VgkdQv8ahesxpBKatwnImA8gzMB1kMEbkOSCAFLF4AvH4BajV1Y6/UkMOeqow6zryqGAhmCYymKhDS5qi/cpvkP/ObyE+9grywTVorijRpec9en09D5lls8OeKMA/Uqj0v2z6wX6TzGSuEut
kSlDAUo591bpQoTNxrhC4WrF71wpC2/Ho/HiUnGbJLM1OFyo5TuFUcwzNRCC4P9oiRzLvzs9WBb1QHR48swQoC9myt1Qw9A1ymeGZDp5RYhD1cY1SkbjnqhjaPJsBX8IsIMljLrUvYes294ZbXGheoB8OmCQ+URrz+dMvU7cbnK6epmrVMYRJnCuu/rnaeUzNwhAGQmj8h8/8e5TNGheaT/FXn0v4tS/9F9iOxV958bMcTjt8/gufIyfnr/yt38AzyoziPlWrwctCV90fAiE0zlSfQJKjPa4Xh58QU7NIpYKGhJnP1uguju5Qs+pM0glxFpHmKZZu4+oeJyubDOI+YarWKwYmfjpFFxqDaKBWIeT0o96MXnih+RSXu5d5rHaGYTzAEKrRyPJslhdg6RamZiornlSipZJZIixEfHH2ozuhD8rNKZLQojRj5CdsLJSoOgb7g1CNvw2NPJeFsh2OJkXaXdEFWoagUbYI40wp79O8CJcR+EnOWw9GbLcnzFUsSqbBl6+3mYQpz66UyCV89/6Ine4UU9eYL5kslk2iNGeloWAthi7IUWPpmq1zOE54rOFi6zq6UOErtql2ymGikvHCJOd33jtksebw/uGEF1YqHIwUMnccpvz8MwtsVE7SdC2aBXBm6CfommC16lIvWYW4UPAffWyTqmvw9HIN8akz/OdLFbrjkF94fgXP1PjExjwA//EnNjF1DT/OaJZMJCVe0o6TFOFXG+tkueSXn9bIpSRJlW8dlIbAj1Iu7fdJc8l62WMcJySZZBKm2KZG3TP5pfNLfHe3RxDnrNRMbh+FdCYxuhBsdUJcUyucIzFhmqMJ+I0X1/kn7+/zhacWuN0bY2ga7VFI1SlhGhqebeBZusLraoKFikXdNShZOkdDhFwR0wAAIABJREFUnZAMP0p/5PXz/826P+7mm83CY16gT5eWFDwnyxCNOWXFK9TwMggQp08jw2C245dxqrrlYw9+p/Moyc6yYHtLfd5swvIJeP9NpD+dKcVFowkXXoKL34HNszC3rIqvZYNXQdgectRBjvuqeOYZYqUJuKp4pimkMXJwBO1dFTpz+xZ87mdgYb1wDpxA1BfALRCZUiqRhqYjO/vI1kO0pz+pCq5hIZY2QTcQlgv1Rcr/699DOGV1wLAcMJR9C5krzsD8OugFtCZLFWnvuOCjqeIuc7A81H6hqn41LHXYEIo/L2xXHbjSWL2EMkX70GfJD+7DD/+Y/I0fIE6sk+8fon/oReRRC3HyNMwtoi1vwhd+g/zW28hb7ykh47nnkHfeh4c3H4kYjw8HjhqDyjyHdlupz4/dFFmGOf+jE5M+CDcVaqLCa47V3kEakMkcXTeKwq7hGg5BOlXFHcEkDVhw5mdBNglKse+nfmHBiwizAv5SUNgmybiw7RnYukM37JBJFXATpD51q4YmNMaJiWd4CBTi9ngMn8uM/nHOejzE0R3WShtIKHzlEcNkwP3RFqNoyt7kPt+5d49ffWaPZ+cvMEmmOIbDhcZT2LpDUjgCdv0HANwf3edi6wq/eu4XyWRK056jbs0BUDbU6/jlX/nvCwiNjqXbaEKfuWjSPKFmNdGFTs4jVLBAcbglFM+fhqt7hdffI5fqIPRk42kVu4ugYc+r1Lw8nI3hl91VDM3g/ug+vahHw64TZzGL3iJ+4lO36xiawYnSKQzNoBse0Y96GJqOo7uMkiGTbELNrimRYho8OoygxtjHHX5axOIamqmihz/gt9YgwDZ1GiWblYaLVXjS/Sil6irqna4J1usWu72AKMkxdEGU5lxYq6o8L1MniFRB2BtEbDZdWuOEh92Q7jhipekRp5KDcUSS5ZxaKLFScvneTp9myaDmVtnuBazVLFYrii3hWSp45mTdZrVqUrMt6rZJ1x/SCWKCJGez5rFSd4gKDvwkTGlNIq52FGL3oB/wvZ0B22fm+PS5BrfbAecWKnzy5EIxLcg4GkbsTwIGUczRNOH12z1+/eV1yqZB3bF
YazikucSzDdIs57/5/BOUi0OPZagDRprJInJWTUHU9Qp+Ef17/JFJoGDaG7pKBdSEQBdQsg1ePb3wp5C8aa6Id1LCwE84uVDCMXX+r6sH/M7FA55cr3M0DPjMkwvc74Y8v1amZpk8u17nhRMN9gch19pDyrbO2UaFu/0xrWnEueUyWS65e+RTcgzqrk5SCDF3e2qS6sf5LOLX/DGR4eJ4d/Vn3ZLf/LwEEI6tCnu3q3bg3a4Se0mpMus/9Rnkg7tqDH90pIRqzz6PHPZV8e50VMjK2qr6XNfV2L0A7bC6quxa9brq2E89pvYn46HKc48iqDURbhnZ2UfMr6rIV01XBVPTEYaJTBNVyHOJsGzEyhnVmeaZItoNWqoAJ7HSE5iWKvrNFfBHaOdeVH8fBcjePvLmOwqUc3iIePZFqNbhwR145iPQ2UeceUY9FqE9mmYITXXhs+dVPrIbJhGYjvozoalpQ5HChczV44wj9RiK3HPy9NHhQDMe3ScUgTzBo+lFMCZ/908UMGh/GzmZID75k4qsJzTwqn8q0EZ2diAKEG6ZvLOnRHjwKEvANBGeh/R9JXI89syXy+o1mZuDyQTjb/6TDyws/J/d/cfymCRXscpEWUTFrBRENJWMFmURJysbhQJcoxv28AyXhtPAT9SId5yMibMY1/AKuIsSj00Lq9eyt6T0AEXRbzrNYr+vRGSa0JBSHRTMWchLjlFoASTHgTgOo3hIO2izWd1kvbQJgJ9O8NMpD8b30YTOgjPPIB4w7yxwb3gXrYhd/fjyKxjCoBd1OPD3+d7+m/xg9x59P+DVU2c4UV3mD+68w7/z1Ge43r3Lh5ae5kOLL6MVfnRH92aWs+NbLh+NBDNyVdaFKuiyAPrkMicnJ5c5fjrGMyozT/txGI+uGegcI0IfuTWCbIoEbE2J7Q6DPQC6YYdpMuVC82nqVhNAHRSEmIn2wswnykIMzaQdtOiEnSLKVtELkyzGNVwkknGsxIjHDIVj7n2Sp3xu/Wc/sNcwwBf/wUV5XNRNQ2PkJ5xbrdIahjimItMFccYvP7vE9x4OSTLJQV9lun/qXJPDseo877enBdDFJYozlXyW5oxD1fk/faLG0Tim7ppM44xXTtXQheD+IJgV9HGc0Z4k1Bwdx9TIJcy5JmGacTRNOFV38QwDUxc8GPqcbZQ5NV8iTnM645hpkvJua0jPT5krmfT9lE9s1Lm4PyJMczYaFj9zboU0l/QmMYeTgN+/2eHtG0cAzDddDF1j73DMFz+6we2jKecWS/zShRVMXRAlOSXHIEpUrvuxvkETYiaWU8E0Ksb2+DmwTY0gzjAK3O4oSFkswoKEUDoJ21T/Zl0T5BKcotvPpNqVS8A21HNyvz1ld+JzuxPQnib8+vNrrDfVRNHU1aHiOABnFKgJhyagM475f24eEiZKWzEuhJEVWyuAQymdUYRr67OgmwsnG0RJxm/9xWf+zOv4x/vov/7br4kzZ2aYVnQdceas8l47jgLZ1OuIJ56Bgx1IEvJOr0i4K3C1/b4S2tkF5OZfPjAcC/uKMT+DvtrPS4qOMYelNcT8KsJykVmCWD2jrGb+CDkZIHKJHHXVAaA6h1abR3hlNSLXNTXKjqbkty6qUb1XQXYPYfuOGtWvnobrlxCnzyNKdUhCZDBC7txRSvtTj5P98IcILYeHWyRvv4+WjJS//a3XoVlHVJqq4CYRMpyqLh6p7HCarjrvJC5G/kWHnyVFR69B7KvzwLCt4mwFj7r74pCioqQydbAB9ed5qg4ORsGxtz3E0oZaRzSa6B/7acT6ObVuMIwiqyBUP69YSYhyXf2+Ng+OiUgiRRdME/XaySICdzqFtbXZIUCcfXymodBe+cIH1oN8s3/1tQV3gUymRJnqzsuWUskf29ZMzWDJW2aSjEnzjFE8mmWVpwW9ThRivpJZwtRVPrqpmcWO3SywtBpxFuOZHgiYxBMkCsgyjIckeUI77FC1qliaxfZkh340IJc5vaj
PUdBmwZ3HM0usl9Zp2vMIBJN0iJ9Oef3guyR5QtNu0vIPuXR0haOgxVNzT/L1h6/z0tLzNOwmUR4ySobcGtziWuc+f/mpn+VrN99mo1nnanuXS9e3OGJAJlO+vX0V0ww4UT6Bn06ZpGNa/j5ls0wiE4ZxT4kUZUaYB0XOfWGxJON46OmnE3RhcGd4g5JZmokfc7LZNOLYDpjI5JH6vUgONIoUPhXqU8HVPRzd5nT1LK5RKuYsopiqTPDTCY7uzlYmQgiqZq3Ywyfomk6UxUUUsD4DILmGi4YgJ6NpN5HFoflU9ewH9hoG+N33Dl57ZqNOhqA9DNW+d63M4VDFzCaZpOKavLIxx92+T5DkdMYhcZpTdk3SHPYGwQyxahk6G02Hhx2f1abHYKrSFNMcTjZdru+PubBaZhxndIOENJfcak0Zxzm2IRgEGWfnPdbKLjujkK1eiF0gcHeGESdqigz3WLPMYsUhSHImYcowSrh0OMLWBatViyiV3DicEGQ5mw2bi9sjLiyXWSo5tIYhvTDmrb0RNw9G/MzzK/zgSovF+RKTIOHqO1vsjTMwBLf2Ruz6CU/MlZjEGcMg4UZ3RFlXgr/9YYhr6oRJTncSz8SMahyfYeiqxT8axWia4O29HlXbRJ8RKSVBnBMl2Yy/HyYZaS4ZhylBnFF2TLTiAGDpgqprslJxOVFxePXMHM2ShZTqvsIk53AYcThQXIHjQ0iSqgPIU4sVTF1NTKZxTmsYoesaEiXKW28qKNLAj/ncUwvF9EHj808s/JnX8Y8t9Pm176u/tCwl9KpWVUpap63e+IXyVWvPfRT5zhtqv24aCMtSLPu2ovkcd/9iRe2AZbenGstSSd13pQI726qAnH9aFZClVZhbAN2A0EeU64gsVcK64y73cFt54jUNsXoaYZiAVLYJ3YA4QEY+8vL3YW4ZUW2qHHjLRqxsgj9SCXXVumLZexXk1R/A9YsQTNXK4Yffgywjb3fRsoRs6JPttdBrKv5WhFPkRAXRyN4BHDxUfvM0VgcJ3ShmRJqC8XT3EXmmunvdeDTVSCLl69cNVcwlqtjGoXIIjLrqvo4RvYW7QQkj46LwZ+rn2K7C+VpOsfMP1MFo1FWFPpyq5yoK1N9pOvn9q2hnnkGcfVZ17MMOdDuPkLphqMb4c3PqWrAK+6Pvo336Fz+wb5L7/s5rmUxxDZcwiyiZigPvp37RgUoEggV3kZbfUsVGN7B0m5pdnUFrcilBKEyunwazyFRNaJTNErKYDOTk1OwaQZHBfoyrHUajgseu4RgOk2TCvDvH1e5NcjIqVoXHamdmO+YFZwkhBL2oQzs84qsPvsGSN8+8O8eit4Rnesy5De4PtplzG8y79Zmw75/f/0O+fOfbeKZJ3S7zT6+/ztQP6SYRYZoy8UPa3SG5pWMbBqkMeDC6TyIjWn6LTthllKhMbz+dommCpFhh+OmEm4NrRHnAOBlh6XZxsIVh3KNpz+MYHpMiGGgQ9xjGfWzN5ig8xNB0TM0iTH11nhUqxjeTGTl5YX1T+31Hdx+tLbKQ3ekDDoM9ZPF1oCYdYeYjhMa/2P5DTlVPs1raoGrViPOQUTwqwoTygoxnUbEqRHmMZ3gEaUiSxx94YM7FvfFrACVb52AQMl91qDg6nUlCbxKhCTWi/uy5eb51t0ecqiAU19J5Yb3C/a7irFddJeg0dEF3krDXmWKYGmXboOQY1DyThz01xVqpOUjU21l7kmIVbPy6Y+BaGjXHYHsYUHcN9kcJDddgzjN5bqmGreuUTIPFmkOUZOwNA272xvzvb++RSTUe36g75EhWazZvPxiyXLPJpGCjbpNmkv/h9S3+5HaHqw8GPL5e54d3OhzuDwiiDNPU8P2U7lEfzbQoly1SKXlzZ0CmSW52ppRsjTvdKQgI0gxXN4iSnFzCMEr4ys0jwiwhSnOOh1aaEIyChEXPwTV1DicRlq5xr68CeCSwPw5IM0nZMdnu+ziGjmVoSGBarEamkbI
y2qYCXEVpTpTm+HHGO/t97nYn2JrSL0RpztBPiBKFxf0Hl3Y5VXO5sFLjzFwZXeTcPFJTr3GQMPRj6iWbpYrFKMxwLQMBDMOUL1xY/FcH5oi5BeR4+MjXHseK+24Y6vfVYj/rVVQhWF2FVgvZ6yMmyhc+C0mxbfJr1xDlMnmcomeZKiJRpLj0a2sIz4P55RnVDqcEpqVy2uMQyvVZcZTtXcTyBjKO0BbWC893EbAiJURTZDBG7t1D3r0Bly/BiU21Cig34OpbyF4HqjXlV6/PqWJ/dIC8cR2aTfJ799HOP47W7bL3B5cAWDi3wO6VQ9ayHHtjgbzdRtN1ODpQcJvN88id2+pgoZtQbiCnA4RuIPMi5nblFKJSQBqMIv89z9ThJSmEeHmG7O2D0BDWmpo2hGM1ftf0QiQZF0W+2JknsXp+oDjkBArxW1gJMSyloWjvIkxLCftGXeT+Xbh5GRbXoVRDnHlOPZYoUkFAcay0E6UScjR6BCzKc8Ty8r/C29W//ptXFNtpMao/jnk9FtOVTLtA29oz7/Y4GROkgaLpZfGs6zse6zu6Q5RFlAwP2yzP0LQVs4IuNEbFiNjSrFlcqq3bRFnEcmkZXeg4tsPedJ+PrrwEwFppXT0eo4JEksiEUTxgd7rDIBpyf3DEu4fbnJ9f4sPLCXW7xjcevM7loxYtv89GdYkLc48zTkYMozFhltELhnzzzm1efewMo1qN3/72W8Rxwom1RW7ce0h/MOaJ85u85e/xsQ2D0dF1ztRP8NTcBW72b6n8eN1i3llgd7ozYw/c7N/h2fmn2Kyc5uHkPsvuykykOE0nJHlMJ2qzqtvcGdzhRHkdS3dYcJbYnjxgo7xJKhMsYRNm/mw1YOtuQblLCbOAUTxknIypWXXmnAUO/APqdh0/nTKIBsy7C9TMGtuTh7zbvsyNzjbnG09gCIOyWWWjfIpc5rT8I/ICs1syVdqghmCcqNepan2wdSYAixWTSZgxjDLqJYsozUgyta/VhGCh6hCnOe1RxCRKWKk57PUD+pOYXpDSHqlY2vZI4VLbPZ+Ka6Lrj8KF4iSnFYdsNF1ao4hBkHJjf8TLpxskucpnj1PJOMo4O++iC8HJmktrGvGpTVXcq5aJoQnKjoFlKC59fxrz/Z0+QaI64u3OhHHZ5sKSx3rZ5bcvt7i/N6Q3iVhteAwi9T5u6hqmrnFiucK33trm1GaDlz+0wZsXt3lw+TZGpUp6uM39yRT/zAm6JYuN9Rpv3Oux1izxwmqF+121Dliv2fhJytXOiKptcLsT0BrHzHkGS57LpVafJ5rq/54mBKbUOJgG7IxCclni2tGUJxY8FjWHum2xPfYpWQZlU73ftkeKjBckGc2ShR+lJJlkrx9wFIRMkpSGbXK6UcZPUlbLLuM4oR8llE2dhm3zxl6f7X7E4SDgYmOAqWusNRyeWanzYBDw9vZIiQN1jZqtAoI826Dvp6R5zmbzRweM/fiO/vf+t9cwTdjfVwI711XHuzieZc2LpWWlzj7cVcWm21WdnmOTbe+hlQplfhCQdsbomyfQ4kh9zeKiotgdK7ofOw/VOcTyBqLSQNTmoHeodvC2+6iA91tq5KxpaAsnVJRrXowRLQc56SPvvQ/BBAYdZb17+kVYP62EbbU5xMlzina3/wBCX3X1aao6bXKyqzdJuhPygyP0lQU8I8dIEm6/36JcMrl/u0v7+iELmw3EoAeei1hZh34LGovQP1LK/jwtsuwlTPrKC1+qwXSoinsSqT8HhKapycD2DWTkI0xbAXwMCzlRY3SBVNoCX71JqccrHx0Oxj31PAiB7O4XQsSi4zctdZ+arr5/3EVefgN59V1EpYo0dfXz3Io6hAQTuHXjEV3wWGsQBOrwtriI0DS0D//kB7YbutJ77zUpJcN4SJwnuIaDoRnEWTzbny+485iaxTgeYepm0cULymaZbtibjavDNCTMQhpOgzANZ7haPw1wdQddMyibZcIsmBHbPMOjFRy
RyYym3cDQDI78Nu2gw5zTRCBYK60TZIHi8AuBpdkc+Lt87eE3yMm4P9yhbDm8vPoEz8w/iWt4zLsLvLD0LC8un2GcjGgHA+bdOlLmeJbD3viId/b3OTrs4RuwVqmwtNREsw3u3d/j5Noih+0B92/usLw6z954RJynfHL9ee4M77FRWef++CF+6jNJxiBgmvgMoj4Np8FqaZWdyUO0IitglIxIczU5GcR9umFXdfyaiaYp5cJhsI8udCSSilljEPdmEKJJMsExXDQ0dqfbeIWIr+W3Zoeehq3YFjWrjq3btIMjBvGAH+y/zXuth2hCULUtptmUFW9NrRuygKPgCAGUrXKhDZAqTCiLKZklJHCycvoDew0D/P03dl7TNY0H7Slxqro4Q1dhNHHBl39po0wmJd0wwzU12uOYXEpsy2SnO6XsmgggSjI6w5DH12oc9NQ4f3PeY6eniGv9aULJVgV7rekyjjLW6jbb/YiFksl8WQngDqcRrWnCvGdSNk2Wy85MmmQWtreL+32+fL1NLuFBW/ni15suZxY8lbVlmpxd8Di5XEIzDGxTJ0olNUdnksJ+z6fVmdLa6+GWHUxTJ80FC+uLjIc+uV1CJjHTB3cQ1TnanSlS03h2s8b1ls+ZeYer+1Me9iPuDQIqts7uKMYzNUqWzpMLZS4fjVkoWcR5ziBKGEVJMaHLifOcIz9Wga258rQfBSGeaRCmGSXL5HCirIxRlnHoh5QNQwn0wgjb0BjGCQ/7ETujiN1xwKl6iSyHeddmHCfsjSOkkLyzN2a3O2UapaQISo5yFwAg4c2HA4QQrDY94lxiFGCiSZhwas7DNjU+car55xjd//HvvsZ4rHboloU4e069wUeR6iIrFcSTz4JbQmQx7O+Rd/vk0xitWUMztEeIVMNA31hDPPsh0rcuIfMcbXlJhbE4jvLRLyj7nrBsZBKrYiRzRLWhxHT+WI3rdR0QasesawjDVAVMCCW0u/4m8tq7iF5bFafFFbX3djwl2guU1U/EIdgWnHxcFUNNR379D5h86x00yyDpTmg/HBDe3cdbrqF7FvgRg0HMJM1oxSknLI3WlX3cYITe2VMj+FJhI2wfqkOMzGHcR5RrSmwoQA47CKEhSlWEV0EITY3WZY62uKG0BFtXihciRTSWkXu3EXOrauR+/D8qTxUTIFFUPOFVZ1Y92dmHvS21onBKyO6Bmi7A7HAk71wtJish7D6AlROI2jzCtNT0YdCbWfNEpaI6+Th+lJjX76O9+sUP7JvkveGt15RyXgnyGk5D7eo1DUmOYzjU7YaKKhXKOz+IhmQywyiKkiLeKU/3krvInDPHgX+IUdDs2mGbslmialWJM2W5S2XG7uSAqCDxLXvLSKAf9dke71G1y9TsGo6uxpeGZuAaLgKNYdLnK1tf5b3WNofTTtE5rRaBOiUadoPDoAVIdie7VOwyp2sn0DUdQzP4uxf/gDffv43nOaRZzq07O1y8dR/TswmCiDzLGY6n+AMfUoluG1y5uoWwdL6/f4OYkLrr0guGtIM+DadKLiWXO9c5VT2JJJ8VSk1oNJwmFbNC1apzf3wPq/DILziL3BreZppMKRke66UNbg9vsVHeYJpOZqLEIAu42L6IEGo9Mm/PK9eAjNmZ7PCdnTdBqEPQzmQXU1fRtGGBJ96btvCTkCTPebf1gNVKhc3qKUzNIs4j4jxSNkWZ4RgOfqImMBJZcPVDHqs9/oG9hgG+crX12iBQIkPXNnjuRJUozQmSfJYY99RimdWyR07G3U5IexQSxhnNso0fZSriVNeIs5wn1uu8slnne7c7oAk2FzzutSZUPJMXNhTqdRwkZFLQHkcFvChno2FjGRrv7E3oTFPWahYrJVdJeQrFhmXojMKEt/Z7/MGVNlv7Q5JcaQjutcZkEp5ZKbFadrnemYCQPOjH1B2DqmOgCXAsjX/6zS3u3tgjR2MynHD44JCdnT6Vepl+d0KapCTj8UyQHAub6Z2r5OU53nh
3h0qzRJAKehM1lau7JsMo49rOkM15D0MX7A5VzK4Q0LAtljyHmmUxjBPF788kZ5slrrSmCAGOKTg/V+Vya8R61SVMlWI/yXOmScrvXTnCdqBsGaxVPTQhsDWdO32fb19tITWdFMn7hxMykbFe8eiGMdv9iEmU8+BgRJblHHZ9Mt3gwkKZsm2QJpLr3Skl26QzDjkz79KZJgRJhpRgmzr7o/hHju5/LBkvP+rMirqcTNUY2St85FKqPX2loTrMs0/DYEAepugVZcsClPfe98naPbWf/8F3yCYhelPhU/MwQXzsFQW6KcRpctBWO+Ks6ISTWPm/pQSvrIRjgPDKqpM9FqSlibKY7Wyp4uWW1PfevFyI8gLFcJ8OkIcPkUc7iLUziMYCor6o/l3F4z641gIh6PZCru+OuPOde3RvttA1gecZuJpGSdO4eqNDnGSgCXa/9j5icUWNvPceKgBQuYZYPKEIe5MhorGgRHPLm1BpIP0xMg7JDx8qW18ckW/fVDjgCx9FWz8Luon0hxBHyIMt8t3boBvI7r7yCjeXH43jj3f+Mlfq/84hAPLKm3C0p3z/O3fVBKQ6h1g7qVIJ79xVE429rcLHryv8r+epUX0cq3RAKRU3Ic9VZ/8BR+CGWUSaZ6QyI82TGbVOZdDnRcysiaEZzDsLDKMhAEahEjc0nYpZIc2VqGwYj9id7BFnCQJBnMfYuk3NqhFnCQf+AeNkzIPhLrmURSqdQZD6TJMJVavKyeoac05TgXbMEp7hUbMaGMLkwN/jZv8Go2jCgufRdEsEacS3HryvLHhIdia77E322B7v0An7nKme5vH6EzTsBgKNtUqFZrPKjVsKFe0HEZPWiNt3duj3xzNYUKnuYTc9tndbNJoVdF3j3ffv8PzSGXKZc29wwKnaGp7hsuDO8+Hl5/HTgHlnnnlnnsfrj7PoLjJNJuQy5/3uezTsBmEWcvHoIu3wiCcb53lu7jmG8ZBRMsRPA97tvMPV3lWSPOHe6B4CwVPNC9wbbqELnXEyVgcBYbLkLaNpOuN4yp3Bfe4OtjE0g/c7l/EM5b3frK7R9n0eDAZowNZgl2kyJpMpo2SoxtJFPG6URcXrrfbNxxjjD/ptr+fjRylRonz0nqlRdVQTFRW+9JWSSpr70EqDo2FAlKjOv2TpNMs2z27UmEaK0LbVGvOHNzpMJjFlxyROVeOwXHW42w7Jc8UgOBwEGJrAszTVnYYZPT/l06canKhb6ELQDhTn3TE1mmUF0DmYBvhxzmLN4cRSBdcyOBqGxHHG+eUyuqZxtT3m6v6UG62AXEqeWirx9FKJc/Pq4DA35+F4Dr1WD9MyIYnJ+0fsbh0yHoxJkxThuOj1eXCrZKM+lBqMB2Omu9skaY5raQz9mM15jxyoOzrPnayzM4iY8wweX3B5brFKwzEZJym9MOb7u32CNKPtx1xr+bzXGvHqZoOX1+pMk4xBmNALUr51v8d3t/sMooTvbveZJClffGqRw1HC7iRgd+QzjGKiLOfppRInFsuMg4Q7bZ+H7SlJJnm/NWK17PDCagXb1LAsg/E4xijWLKMwZRym3OiPqLomhi6USyBTr1fJNkgKLkKW/ehr+ccWemEUHlrTJAsSKJVVtnsUqTH9cbc67MC7PyD3Q9KBYm7T66kucHGFzI8UthbU/ema4qSPxxgf/TDi9FPguAXjvQfDIm1uMlQissERolxDOJ4qmKGPqM+rDnzYngnS5KgDO0Wi23CI7B4pUV2SIE6cKwJsylCqK8FbY0mNtMd95MED6B6CbWPU1brh0vUO01xZpvwo49ruiGs7I3Z6AWGe89LjTXaimMNByOsQbXssAAAgAElEQVTfvEO1YiEvvom8ewPRmIMkVpbAO8UaIQ4hDpUFcNQDmasCbbmwe4/8f/5vkXkGh9tkf+e/JN+9Q/7gujoADTtQrimU57V3yH/v70GpimzvIdMETIt86wpy3ENmKTLLFGOg1oQ715DXLitI0fVLsLK
JtnIKee1NpS3odtWBa25+Bh5SOGOFPsb3VbE/PFTFPc//FMr4g36TyJli2yt861meogsDswirAXgwfqAOBlJF0qYFex1UgQAwNH32a9WqkuUZj9XOsFHeJM5jeuFgFs/qGjZbw23aQY9RPGa1tIomNAbRkHbQYd6ZV1a4sAtSEmQ+B/4B77SucTSdcKfX48GgR9v3yaTkTPU0UkpKpseSt0jdrvPc/NNMkgkPxw958/Ad7g638EyTKIypVUrcuLOtmOALFUquw2A0YW+vzaA9wg8jnrlwGjoh/eGE9394k4VmjT9+cJXXt6/wWGMVWcB/rnavsz89wNLNIu/+kCu9K1iaxY3+TTKZcaN7h1//0t8izCLafp+/+rX/id3JLu9138PUTB6M7rPkLuKZJS61rvKffvt/pGZVuTW4ScWscqK8zjvtd4jzqHAqDHANh2cXHue91n2+eu99Gk6Fr9z9I1ZKyyw4i1ztXmVnfMA0jgmCiKpt88zCE7iGR1LkzZuaSZqrA+k4Hs9gSCowyCD5/4GP3rWUhc6zDYI45WTVpeenTMME19IxdY2dsc8gSPhnVw5IspzhMCKXksNhiKELBblJVGdfdkyiNMNxDFYbHj0/4eeeXeYvPbuKoQuyXHLQ99GEIvDdPfIJk4yDUcxa1eLIj7jfjTgcJyy4DmXLYFwAW8ZRyqW9Cd+80eH6dp/twzHbrTETXynan14sI6VksWTyU080+dSpOp853WScpNzq+nz9Vo+LOxPyXKro2H6bYGcLxh2wPdyyS5ZmJMMB8uAe2bDLwplNNZV1SkTbd7AXVznq+HzjrR3Wmh6tUczppk3XT5nGGafmHHVwiVKutkfMuzY3jnxsXSNIcv72714nzSS6EPyjb9zjKzfa/Is7HYSAN3YHnJt32WhY3D6c8je+dJ0nFjwOxzGeobNWs+hME/w0I0xzjoKQaZLymcebPNgf8eaVQxZrLt+43sExNRq2xY3OlMNBQKczJc8lnmtyYbWCY2jFtARqts4kVDHoWx1F3styyXzVwTK0mY3wz7r9+EJvmWBZxLtt9abvlZTiejKZBahw9SJs3UZu3SObRGiuhZhrPvJaW7bKMQf1Pb5SdJIk6qNUVkKxrLCRpYkSxmk6YqlIjNu6iXz3e8jpEDqHyrt+/BgLD3x++xJy5y7y+mXk9WsK9nOcXmcYyKtvFZ57oTzzpy4g5ldVody+raYART67ZpvsdH2GWco0y/HzjP04Yd4yaJoGOZKSrvP9m11WLJNEStbLDoNBpA4dw6GCAwmBfPvbKus9iRHVObUT/85Xka2HCMuBvQdKJDe3iPjkpxGVBpSqiPNPQq8FD+8gD7aULkE34cpbUCrDhz6JvPpDtLUz6r6cEtq5F5V2IQ7VRV+dQzz/ipqCPPeiesJsG5KQ/OENNe24d5fJ5W1kmivh3aiHjANII+T+jur+JhP1fIJ6jnxfvXbTqSr6H+Cb8oSrLlGlmRlkeUacJ8o2h2CcjGgFh3TCDmFR0O1CsCeLsXrJVElYURYXwSqxOjDInHE85nr/GkHqY+sWpm7yWOMknuny/MLTeKbDzuiAb+18l07QZRhNuNC8QJgFeEaJZW+VPX+HP979JncHD7jdO+L6QYswTSmZJuvVOivlMm8fXWJ/us8oHjGIhqyX1vEMj73pPpc719kdK5/xKIpoNKvs7rRgHBMOffyhz+67D5n4IdV6GTSoVUq8/bV3MVbKEKbMFfQw5VGH7+/eIcpi/uHVL7FWXsEzPCzN4k92vsffvfQl/GSKRPJguIejO5yorvA3fuovcapyihPVFX7miaeYJBPe2H+X9ztXOfAP0YTO9/feomJ5/LWX/02+v/8mz849x8PJQxzD4VOrr1IyysRFep1reHxk6SM8t3SKj6xtMo0Dmm6NcTzh4eQBk2TCd7bvcOPWNlma0fZ9tsd7BKlPJlM6YWcGNMryDF0zitcxUijjLEHywb6Gj2+2qbNzpBDYUZbTmya
EsUo9EwIOpxHvtobsDQKmYYrrGizWXDzbIE5yKpZOvWRh6hphkhHFGe32lFxKoiTn1lEwOyRoQuXOv3CyxlrN4rOPNzi/XGK5YvH9ByO6fspK1eKnzyxg6xpzJYu5ksW1oxH/x/v7bHd99g7HdLs+WSZZWyiz3PTYWChzp+/z9u6E9w58tnqhEnEmKbfbAfe7IUM/oWyrA3WpWlKTSaGpyW0aM77yJvl0rCa2poPRmKf97ltgudB+CPMbRGFEHKcsL1d468ohfpTyW9+6r/5dJeWt//0rbb78bmvGGNjuBli6RsnS+K9+9SleWmlQsjWefGyemmtwc3/EpZ0J01gdGt96OMbUNf67X32Gd/YmfGS1wcE0xDN1PrbW5ERZNYzzjo1j6GxUPJ55bI6T6zW6Y7VW2eqGvLHX52Ev4sa9Hq29HmGY0un6bPeVvTBOcy4f+AgBIz+mN1GNSZpLxkFCkuVqNfPnLfSYJvlwjDlXVmS0fRVBi+PMlPhyOEAe7isPfRAjY8W1p9dT3eCwB3NzyEwikxQqFayFihLyfepz8PhzKjrWMJDv/ECJ1WrzKlwlz5RVbmlNfQDimY8j3LKyj2k6JBH57XdUhvt4iKjVoFxGbGwgGnOIT3we8cnPI154BVFtIm+/D/sPyd/7rrLdlWrqIlo7iXznLajXyaOEhmOSSrA0QSKlEhNFCUdxwiTLGaQpqZQEec5WGHN/HLB8YRlx8iTZzj7B995V4JrHziO/9mX1eyGgUkNU6xAGqrOfW0QebaOdegpx4WVka1tNSc4/rw4lTzwH/Q6MBsib7yBe+Xn1eejDyiay3wJ/otT4RYEXlQbC9pRuYdRDHh2ogCChId+/BN0W8o0/gX6XrDtAGDqaa6q44UwJBGVcrGymUzUhEUJ18PU6MghUh3/Mvv8A3yzNKhTypcJiNcA1XEzNKPzdMUEaMIxHKgAkU377aTolzIIiEnZagF/yghhnUbUqGMKg6TQQQjBJJmwNt/mjrXepWmUWnHk2yieQ5DTsOq7psOA1qVpVfv7Uz1A1a5wobzJORmyN7vL20SVu9x4yjn02601c12apVGKhVOEXzvwkv/bEF3hp8UUeb5zjwWiXS61rfGXrD3mz9TYr3jKe6fDyytN89e57zHsecZzgVTxwdIySDX4KVQtyyag7hlQy6E+gbJK2pzBO6O50eezsOh9dP8OtVpv3rm8xiX3+rfM/x99+43fYHu/RdJqYusl64bjpRh2enDvL7nSHjy9/gs3KJpe7l8llzssrzzFNfX7y5Cs4hk0/HPHt3R/wy2d/gXHsM0kmPLfwFO3wiGE0pGbVGSWKN7DsrVI2K5iaye50h6Npj9vdQ+a8Ot9+eJNxMua3b/2ROrBlGZ5noxs6856HoekkMpnlEwRZOJvEAAXsKJ3Bkv5leM8H9eZaBsNpTKPqkKY5t7s+J5sObkGCqzoGe8OY2+2AJM2ZTmPiOGMcJIzDBMvUuHLozwJUwgKW89T5RVxL4WmbJQNdgzDOuLUzYKnmsFAyaboGfpITxDkHIwXTeWLe4/OPLWAZqsg/6E+5eNDj67d7bLXGxGnO+kqFJMmxbZ3uOOQvPL/MX35xjQ8t1/iJ03V0AZd3R3znQZ/LrQln5lRn+sLJGhfvdDBNnSwr2CGVpir4kQ/zJ1XmR+8ANJ00TlUGSRort9C4y8Zja5w/M8fe3ojDnTaebfCFl0/w+vsHXN0b8+rGnEqJcwwWyyY3u2M25lw6QcRLqzXSXHK1O8TUNM6vlhmFKf/G80uYusbDrs937g34tWeWqbgmcZ7xofUKB37IIMw4UyuT5Dm9MGap5NBwLM7UymwNp0gJrc4U09AYT2PGYca3b3aKbbjELbvFW61JmuUkec7OyCfJJdM4x7UNwiRTj7tqE8QZjqlzNAxmLLQ/6/bjxXjf/NJrwtAR9TpyMEbz7EdCPCmVtWp3V30ex2QjH73sIgwN4hjt2ReQ/a6Kd41jtPOPg2WRbO1
1WuRf/8VbvHltm/2LLl/e3aLXHzOdLXlwco6ta/z03Ud4fkil5lLL57hfv80PH++x4jTI6w7XSxvsuGvsjy/4zXu3+NWta7y8tsq3/ut/yp98920Wc49PD895vnfKzx49R1UVFEXm4/1TojgRBeNyzPF5j4OjNmfPLgimHksvIE1ThpMZk/Gcfm9MEER84yv3SZKEvK5jKAoTz+f+S9dRVIVFGHJxOeC9dx+Tphlhb8Hu1grdwYR+X3w5msazbp8t1yWJE7woIkpTjoYjKrZL0cyzlm9yNm/TcKqkWcYyXhKmIWESoisak2AmkgE1C0s1kCSJnPaL4z2/KOvgZIwiSywWIZ4XMxvPCKOEi84MRZEJkxRZlnjlRo3eVUBN2VKvDD8Ktq0xmfg8fXTO+fn0swCbiR/zaXvOaBmx6hrcWC3gGCqbNYGqlSR4+nzAbLIk9EOhPXncRjMNtGqT67dX2d10abgW42XI3rMuuqmzmC042O9z626L5q3r6KYOGWxfa+JNpiy7l+SKOXRT5yfff0y1muNwv0+j7nD0vEPgBURRgiRJXDw9JPY9DMtgd9Pl+kqB9352wNeuu5i6IAY2XIvHJyNeeXmNeslmY8Pld/7L3+KTH/2cxVyw+c+Oujx7dIqdExqa44OeYLVoJswGQiAY+NA/g3AJigaTLuF0yuXFkMV0QRzF3P3qfVRVIe8Y6LootLde3MawxH4izZjtP4WcC3HI7s1VkjghTVKCQR/H0eheTimXLeI4pXOlYZjPIyxdJUpSGq5FexqyUTKu4D4JSZbiBTGmrjBahFQqNpYpEhknywjX/sWfxZ97ox/97//8W1azSG+vj6LIKLKEndPxJz7d8xnluknmBci/9DXQDQGSGXSEZ/vpE7KeKFLGzQ2yxRLZNpFu3RHt5CQRreIkERGq50egGaAbZI8EpS4+71F87RoHf/o++bLN6emM7c08RjWHnHeQmi1xux4N0a5vYEzH2F++QzqYcPj+KcWShZRlpHOfg8sFoyQmn0Jpvcj2r70AiwXSygrZ6SnxaIn1+m3mHzzHa08ob5ZovNBAX69DFCMdPEN+6S68/y4YOvhLuDgXxVLTePbP/z3Vb/4y2adPqNQsJE1BMXWyuU/3csl45JO3VLRKjtSLmM9CjiYesyThRsGi7UekGViSzHwR4xZ1FCnDuraCrMpIqyvIrZYQ3CUJ0vYu6Cr6l14Gb4m3d8FgHDBNElq2QZJkNCsm7s2m4PYDwdEl2loDuVQkvewJS16Wga6RvvceUv8S6c59QdA72YdKExZj0g8+QDZ1kuEM/ZffQBoN0NbqSLvXoHv5hb7R/9u9P/tW1XbpLcd4cYytaRQMh4yM/VGX9UKVWbjgVvkGYRqRpDHPx4e05z2eDvrMwpA0Sbm5tUp3MMHNO7zYWGceLikYOSqWS5plvNd+Smcu2tO6ovHdwyeUr26+13Za/Ohnn1DI2QzGM1brZRIJZFWhYltMA58ky7i51qQ9GPNfvPkGP/l0j8PjDpubTcaTOcPRjN75gMyL0UyN1lqd1mqNzmzGvdYKz4ZDkiTl7716lx/8/BGnFz1u3djka/dvkUgQJQl7o3Ne31znL/Y/wlQlZuGcvVGHKElwdJ3/+d/+J/7pl7/Co94Z+UqBLE2xHZMsyfD8kMlkQRREFAoOYRBBBosoIBn7bF9vMe5MQBLumNl0SaVUQFEUDEPHMHSqOYeGW2Dki/93u1RCVRTe2N5k4vt0emMWCx/d0Eh0ifFEFHvT1CkUHFJFotMeUHJzbJZLnI0nvNhsUrRMkjTiB8d77I8ueaW5w5rT4kH3MZZmkNMcDmcn4gCWxtyrvMAsmtPKrVDQC3iJxy33zhd2DwP8sz978q2Veo7DowHeMsS0TRxHJ00zzo77FEoOYZxys+mQSgI8czwKeHo2+exw4PsRK60SQRCzulpgverQm/qU8wb1vPCqf3QyEiDNKKVgazx81seyNMIoI1fM0bvoE4URkR9
gWCbIKgsvFp9RkoSiKmxvlbm8nPHmG1u885M9ZqMZbtVlMV3QbQ+vospljHyOza0yfigS41ZWC7TbM4qlHHfurLK/16HfGbJ6bZ2XX9tmOPJQNJk4y9jYKPPWkz45W7gmLoZi9BMnKX/13U/40qtbTBYhnmSRpiKxkgxybo5BZ0CaZuimTiyS1sGwSeOEtesbTPtjsAoQLAS4LYnQCyVUXSVLM+qNPKWSxXAkAnHcoslsFrC+7hKGCb3zPkhgFIsk0xG94wtat3YIvAg9l0PTdaajOUkmUa/nCMOEZj2HqgkNRLu34HKwpOxa3GrYHA48Rl5Cq6jz8HxGfywOcq/tVlgECev1HDcbDscDj9976f/7Rv+5hf7if/lfv+WuFUlGC5yqQ+nOClmcojs6w4sZ1WtVZF2DOES6cVeotZMYTg9IDo5FprkuLAWSZSLdui0Ibd6SrH2GlMRQaZAdPUMqVYUIr32KlC/AbIpaynH65x9SrVkMu0tWmzaFuy2UchHp7ovCPzoeApAcnqBvrRCfdPCP+1Rv1JleTHh6PuNZf8GKobFaMFl6Ca2v7JLOlwKeUyoRH12I39Efo7sOmRfSPhrTPxpy/uCMla/dRXrpFSHIq1ZFYl1jBSZjqFSY/+kPWPl7r4LnER6cI8ky0/0eeslGLVgU6zk6h2Nq60XSZYC+WsJQoN2eYckKWZoxSxKiq+cuZ7C2VUJ17c8ARdLNF8BbiMQ+2yYbDkQwTRSBrpNrOFSLKv4Vg1lMFjLKOxXksks2niDJMtl8TjadfebT9x7uo2kZUlnE30pkgn2wdcXel2SkwQUsFmR+hKwrJJcjlK9+Ff/PvoNatL/QPvp/d/AX32rlK4x80YJfLzQwVI2cZnMy6XG91PoM5bqZ38BUTPr+gEf9M04uByiKTCXvsIgidEPjTr1OmiXEaczZbIgsZbhGnvN5n7V8hbpT5nH/mK2iS2+5oFYq8NP3H5OzLeZLn821OlubTdZLLr+ycYswjTifTYWafzDild0N3j44pN+fsLO9yuXliMvzAbPhnGLDpVjOs/AD7myvMZzOcWwTTVFo90fYjsVRf0jesVgufC57Yx7vn3LRGfA79+/wavMajmaxU6qJFr5d4mw64EZlhf/jr37K7339dXrLCU8ve8iyKKqVqiuKfZZx+OiUnVvrTCcLCsUcZBnj6QLHtel2hhAkkNdJoxjilHK1iGUZZFlG0c1xq1JFliVmYUjBMOgvlzRzObqLBbam4VYKlOsuWZJSKuTIORbj6YJa1cV18wK9LEssk5iL4YTdepUkTfmPP31AbKnUHefK/qjwfHzE/foL5HQHW7WZhGPm4ZJl5KPIEmESsp5f48Pex+T13BfeXvc//clH32qsFBlPQxazJTdvNZFlmRvrLk/3LtnaqiDLEl6U8cpajpWCztk4oD/xGAxEQRLWsQTT1DAMhTjNUBWJ6TJCviKteWGCY2rYhsrB5QzH1pjOQgxDpXsxILm6ZTfWm9SbRVZX89y4gsGcXswwDIXL7pzGisvB8YgszSg3yox7Y6LZBOJIYMTTjFSSqNYLzOchtVqOKEo/49WHUUIUpyzHMz37zRUAACAASURBVHwv5PSoz3Q05etvXsM2VCp5A1VVmPsxOUtE5BYdnQcfnvHVX7nB5XDJ+cUMWZaZj+dYOQu37DDsTQiefYhUrBMGIflSXuhLsoycm6e/fwjeFKXcIMsQIm23hpWzsByLai1P2bWuMuFTNE1hvgiplm16/SWmqeCUimj5Ik7ewWk0sCpVhr0xds7GtE2SJCPNMgpFodovly0cS+MnP3yCYVuoqkylIvgZj9tzSjmdak4nTDPmQUqcZmTAPEhQZInfeaHKv/zhETlb4/dfXv3bt+5NUyUeL1EUCXOziqSrJF6IbOlouiLAMrmcCGGZjcVJbdgjm06RDQ2l4iKpCuFZD6pV4Zdf2wK3hHTtNjRa4F8pvEP/b8R4WUbWuSR4ekKpZJBdJff4foJ
/2OfyLz8he+9nYlRQbxI/PWT48AyyjDROMHfqzI8H9Po+DVvn9XUXgDhKqVVN3v83HyJf20H6+m+ALBNPlpCk6Dc2CbtTup0FAK3VHC/9998QosAkIfv4AdnTJ2BZZI8/EbS+OCb3pZtIL9wTgBtVYbp3Sa/n0fm4TXQ55ejDC3Z3iszaU9RyjiyMiSYeWyWbIE2ZxAmqJDGOE6I0o1wwsHbr6FtNtFYV6cWXBbq2sSq4/k4OaWtXvEjFElLBRXrjK+g3NnFsFS8UKvnWZhElb8PqKmF7jFK0kBQZpeDgv/0RRBH27/02lEpIO9eQVltCY2FaQjuRJqJzEcckc2EDQdfR3nyV7NkTZqdXxMIv8FJlWYiAZIVWrk7BcEizjCgV8JckS3A0m67XJ0gCBv6Ai3mPi9kMw9QxdY3+dM75eY/b1So3yxu8VL/FSq7Gays3aOXrV9G3OcIkom5VuVFeAyBMEp4cnCHLMjnHAmAwmPLJ40N+/OApPzl7hhcHNHM59s4vefzkUIwFFj4rKxW63RHdwZhcyaHYdAnCCD+MMHWdP/3uO7yxvcHvv/AlCoZBkqSEYcQr6y1mU7F/kyTh9s1N/uCbX6czn6MpKj85e8aj3gnL2Odh9wjXNBn5U7784nXebL1Ab7kkTVL2n59z2R/zfO+Uxdzj7KLP7t1NDo7a5As2i4VHGMVYps7SD0QEsmvAJETSFYq1Its7q2y36jTrJd5YbTEOfIqGzVqhgK1pXCuL9norX6Bomry+ukm9mEdTFaYLj9nCw807xHHCnVqN07MutiNUy+vVEh8+P+b5cMh/99u/xm6pxL36Gq18nqrlsuOu0V5cIksK03CKKqt4cXBFwjO5VtxlHIyZhwuyv7a7fIGXlbMYTX0URabWLKFrCvN5iCRJ6KaOHyY4psrZYMH5JORsEtKdeLTbM3xPjAo7bdGC3moVeGG9xPWVPLWixYsbRVxLoz8LUGWJmRexVTapFU0sQyVNM4aDOYEXYOUskjhhPplz9LzNJw/PGcx8oiSlXLY4PR5y9MkBhqF89ncPOgOCyQTZFofDLMvgisL54J1nbG1XuH+tSj5vkGWi7riuSeiHGI5NmqRsXFvhn/zj13i418cLYx4eDBjOA2RJojPy0DUR4bp9rUar7DCfh0Jb8PyU5XxJ96zLbOZDBuXXvkown2NaJt7cI45isiRhMV0g512obpBMBiCrkHNprDcoVwu4rkWrmccLYjLAtjUsXWWtniNOM1YaOSxLY71VoFJ1SNOUxXRB4AWkSUroh6ytFZiOZhimQb87pV53aLdn7B0MeflL12i1CtzeKaPIEi+08txdKxDFKboiMVqKuNvZQghRW2WbrZrDD56PGAyEtfYXrc+90Xt/9IffUiWJOIyxNypI+TxxeyCKvwSaoaLmTXHL/NIvQ/ec7LngoxPFpAuPZOajr9WQGg2RoDafinZxEosOQOCLnxkMRFCKosBsSta9JDgZEngxJ6dzsiyjVDKxbzTI31wVB4zFHPaekSUJua+9yuKtj1EsnU9+sM/50P+saI4nASDR8yOiRUyr6SC3u+ibdfA8ssGIsD1Gd8U8RPYCyqsFnG/+Mp1//QNymxW6/+YHhKc9rOstMZe2bcHG//c/xqzlREhNGCK3muh6RrlVxNQknj3qc/3NDebduSBZqRLx2EPRVWxDBi8VyU7Aiq5RNTU2dlz0G1fuhZUVEbVrmII2GEVCy6BqsL4LeVdkA9g5JF3DNQNyvoeqKuiKhL69Ar0eaqNM1B6S+hHJzEM2NRTHgMBHWlkVB63GqsgWePIxUrkKVk6E3ciSAPGQIdcqpI8/RYpCWAZozRLyb/yTL+xt6NvH//lbmqIgIdHK17FUk0XkMfDG2JqCqeo4mo2fBKw4TZ4M99gfdxj5PpqmMhxOCYOIW5urNHN5tt01lpFHdDXnX3EaeLHHW6d7+ElElProikp3OeF4OGI+XZKmKScXXSSg7Oa5/8IOdkHYxTr
zOf3lkqUX8Ltffpnvf/wUt5TnZz9/wmQ8J16G1OolFksfVVWYDecEScJmq05nNsPNa8zDkMFiyWgwJdAkZFmmP5xy7+4O37x9jz/6zl+xs9bgj77zPUazBeVSgYZTQJGhPZ/z/Q8eU6+6QMg0CHhxtUlgyFRKBXI5iwefPOfVl29wdiE0M24xh+8F6LqGLEmYusbCD4Tw1lbJOzYVN0+96hImCa1CgbHv4WgaJTNPkiVUrCJ5zWa7uEor3yBJY5EymPhUykUuh5PP/udK1eVsOGa9WWUyW6KqCt3hlHrNBUmiM59ys1Knu5hwv3GDzcIm3z9+lzRL2C6sY6qmwOjGC5IspWjk6Ho9kVIYzikaeV4o3fvC7mGAP/z+wbckSSKOU3RdISMjy0Q6nWHqyIpIMstbGmslk/3ukvEyJE2hWLS4bI/JFSwaK0U2G3luNEQozMyLCRNYKxmcDX2O2zNqrsVgEWFoCt2Jz2TiMx3PkRWZxfkJkumgKAq37q6RZjJ+mDCZBYRhwnSy5M4r23z6yTmqptLbOyAJAuifkOUroqMrSeAvSIIQd7XBbBayiFOCICGKEmaTJXGCSOLzQ1Y3G7RaBb7z5x/Q2qrx1o+fMhl7uGWHvC2CXdqdOaenY3RdQ9UVhmOf3a0SsmlhOjaSLDE4Pmdle5XueR/dtsi5OcadHo5bIElSUt9HUlUUVUE2bXTHQlEUCiWHLMtQVZkoFkQ9RZIwNAVLV0jSjNWKQ61gYqgKSZaRSVCtOgz6C+Zj8Rlq522GgwW5ghDxmZbOeOxTKolZvefFFAoGfphwq1Wkkdf50dM+yyChVhC1qWxrLKKUKElZcS0+OBySZhAlKcWc8Qtv9J8bUyvLEmkQYdVyJDMPpVgkSzNyv/Ea0g8+IPVCSBLCzhjj/JDs/ASpVodmCykKkaZjlF5PqL1bm2IGn8Rili/J0D4R+FVVRXr59St/dyRu6pqGpMgMRz7r6zl0W2cyWGL7MXLDBt9n+e6nIho3Soh+8iGypfHhjw6pFAwKEkRhSpZlPJwuKaoKdU2lE0XY04DiSp5sOoEwRF+vo9/YJOv1GT27pHSryXvfe86bW8/p9jxyPz9AUWTsVfcqGGaGtLVDNnqGs1PHe97BqlTEQ0sSZE3BOx2i1wvcuJcRDxcoioRTydE9nlAo6pgVBymKqVVNcp7KaBywvpajsF1Fv70tDhJRJOiB9ab43XEEu7eF4DHwIOeSzUZId75EtvcAKnWkb/x9iuq3KfgBkmMLWl6ng//4GNV1UJtlkt4IpdUgPrskORtgpCnk80iTMZwewmQC58fQ2hGJgnZOQJK8UHwPkO7cxW42xWv7BV5BkqBIMiWzQJwmuHqRKLnkdmWXvZGgyfmJz/msi6M94+PeIWv5EjfLLQb+hGNnxNl4wuVszjd276NKCuNgxopTQ1d0HnQfsz/qUTAMXmnuIEsSXhwwDQK8ZSCsbtMF1zZXRUxqGDPxfYqmyXCx5LI9QNVU3FKebz98jKLIvP2jh9RXK/hhRBjFKIrC8mICroHmGER+SJqm6LrG+GrevVErc3O1weFgyKA/4f5L1/nBd99HVRX6oykfPj+hWnVpNMs0HIfz2ZhblVWGnke9UeZiMKaVzxMmCeezGWmaMZnMKRZz7Kw3aZ/3cfMOmq7y6bMTSsUc+byNqoqbm2MZxElKveJSLDpcW2+S03Vyuk5nPmcll6No2MRpzJ3qNdZza/T9AVWzwjAY8XL9Lg96j2jlK9yrFzmfTKnMili2QcmxWUQRDx4dUKu5rFVczuOEa+UyB6MR+4cXFAwB1jmeXDAJ5lzM5ziazjyaU9ALlMwS9vISVVaIU9F+3ils46jikPdFX6oqnEOlkkkYijz4KAp59Wad0/6CwdgjSlLGy5DDgbjZVwsmDddicWUn87wI348pWCpzP+Fk6LFVsRktY376fMhFd0GlZGHp4jWNk0yE3yxCNF1jOVtS3N4hDmPiOGY2CzFNlcUipN8
RI9TVjRqnJ2OiMKL3yUeojXXIIE4TcsUc8wdvwVXHC9NBkiRKZYdcTvjbsyyjUrHo9ZYkUUK5UeZkT3APSBIO93uU62U2t0qkaUZ/7LHZFO3/5TLC8yJGs0CEyswClsuIcX9MbaUsFPNdwZsAuNw7xHCFYFuSJIxCnmDpoVomuqlj2iatNRdVlT+z5GmaQt7SANiqOay5BiejgFZRZ+zFbJUNPjydAdB0LS4uZkiShHXVicpZJkfPzijXy1y7UafTmVGvOvSHS85PhxiGSqFg8MnJiP7MZjoNqJQssgxaro6lyTy6kJEliVkQU8kbfO1Gmb2qw2l/8Qv3z+er7pcxYZhgvfmiYKqnKcHUh8kErewIkZfjoL+wA0Eg/OyyAqWKENpFkWDZ338FLFukt6WpyGPfewRhIEAtqkr28YdwcUJ2fEAWBGRBiFK0cIsGVj3PYuzRuL9GcDESxUaWsXZq6Ot1JkcDzC/dY3I+ZbuVY3XbJY5TPhoveLczRZVg3dDphBHXSjaOo2HdbCG9+hWktQ1xQ5dlZh+dUv+1uywPe5RtjbA95tZXNlByBmbeYH46EoV3c4tsNCDaPyNZBli31yFNWXx4wPS7PycazLFfv8n+20d8970zpr0FZt6EJKNcNii9to3eKGLt1infbFCt29y8U8Mu22i1vMDnuiVRfDd3oLn2Nyl0hk3mzcEuiCKv6kiWI4ryYgaVBlSrSFubousBYBiYWzUURye66BONFmTDEWo5j/H6XUHqcxzwFvg/eId0MBKCvMtTEVXre+L1cHNQraL8zjfJDvZFkf+CF3o/jkmylJJZIMsy4ixh5E+RJZlWro4sSSiSQitfZ+RPKZkWUZpQd8roskqSprSKBX77+gtossoiXqJIMlEa8XHvKXGasFYoEiYJ75zvcTLt8KR/TpSmWLYhTu6mjqzIzBc+K6sVLs56LKKI69UK5UqRe7vrPH12wm+9eBtvGXDj9ialUh7PD/HbU04+PBIiN1Ummvs0GmU0TaVaKvDVtbvslupEqRhvnZ50+cbr9zjYP0e1daaTBb/05j2qNRfHMem0B8zCkI1CmZE/5Wmni+cFXGtU8aKIx3sn/PzJAZIk8eYL13j/wVOev3cgsLqaQhTGrDbKbGw2cRyTer2E6+ZYqZfZ3mhSKNjk8jbLKGKtIISKrzQ3uFff4UZ5k61ii6JeYBJOKOpFRsEYCYmSWWKnuM488shpNo6hU68U0XUNQxGF59bNDXJ5m7PhmMl4ztF4zGzp86v3bzMLQ1p5F03R+PfPHnB40SXJUnJajiejT/HjgGXso0gKZbPES5WXOF+c4ycBcRp9zg76YqzFXKjQN5sFJAmCIMb3Y5aBYJ87jo7rGFTzJoNZgOvoxElKLW8gIQ4K1arNq7fr6IrM2I+p5AxGy5iTvtD1VEoW46nPpycjRouQ9kg4gFRNwVt4pGEAGSiqgmEadM4G+H7M1qZLqeZy6+4aB5+ecvtWHTtvU9i9iZN3iIMAIp/5wTMoNiBfBlVHuxr7KYpErWBi6YIDH8cZk+GUnet1/KUPiQhz2b63S77ooBsqhwcDNE3hxprLzIvodERxzeV0giDm458fcHjQR5Yl1rbqnLz1Y0aPHuBWiqzvrpAmKValglMQ3Qmn6GBYBoZjkSvm0HQN09Lx/ZhSwURVZXbWXa6tFqgXTZqudZUDn7Ba0BktYwxFpmAq7NRsgighzTIMQ8VyTCRJolAw0XWVG3e3KFfztNszZpMl7cs5g8GS3et1giCmXDDYrOV4djzi5KjPfBkRJBnfedQjSTPiRAgl7604/O79Bj/ZH7MME+L0F4+gPrfQN+420U0VBgNkQyjl4yglWy4xvvF1lLxJNhrDcCiKe2tTFKN+RwjWin8d+KKIGXPnTKS5jQZiFqwooOnQ7Qrb2OY1cWOWZcLulN5ej1wzz+x8gmmqBO0xkiTRf2dfHCKA0VtPqP+DLzH6Dz+ltFnC8xIWgyXtWSA2uCTR0nU
GUcztssN4FrH+P/xXjN7ZY/atfyba1ZUKix8+xNqoMH9vj4/3xziOxrOfX7D/7inR1Kd3MaPyzTfE2EHRYLkU83NVwfv0jEf/6seCM1AwefrOKSf/7j1KrsFLjTzFZo63H3ZIoxhZU5l/fEqyCEjmAbKlYW7VMLerWDt15Nu3hcguy5A0TTAFlvMrpaoi2vQgWuqSBHaObNyDYCmeXZqKv3E2E12BMCTujUkWPtF4STxZEvRmnP7oOcFxFy4vic8uydptcMsYX3sDubUibI+WDd02NFZJvVAc2Dod0r/8jkivK5WEg+ILvHbcKjISpirebIok1MFhEnLdvYYqq4RJhCLJ3C5fZy3fpOGUeTo4wlB1arYQxQCcTi85GJ/xsHtEey5av2uFBgNvSXs641alwXaxhQQYisLJ8SW94ZR61WU6XVDI2wz7E/wg4tGjQxRZxvMC3vrgCf/Nb/4S/9dfvk1rvc55p89i4RONPUgBQ8Fs5kkvl9RXK/RHU/7gV3+dDz96zv/4J/83G4UmrXyehwenrKxW+N4Hjzg+7rC+UuXwuM3Dj55zeHDB+XmPX3/xFqaqUrPLTAKf2ysi1vnxaZv/89tvkaYZtmPyzruP+N57n7C5WqdxewVFUXj63nNMS7DPT447KIpCmqZCUV8vicLfKPNia4Vb1Sq9hfjwnYZzht6EMI0I0wgv8VEklTANkSWJol5g5I/QFR1LFRkEtqoSpSlFwyDJMkbDKbO5R5qm+F7AaDTj0afHDAcTng0GREnCWyeHGIrOqysrXGs1aOXrqLJGQc9jqxbLyOeGe41FtOTd7rsik8AskWTp39Hu/P+/NrdKaJqCrspomvjYzjIRtfor10tU8gbTZUgQJWzVHOpFi0rB5OMT8ZkpwlQyelOfvc6M550ZH+4P8MKYOMlYqzhM5wHTacDWSoHmFbFNkqB30Sf0Q5ySy2w8Yz4RB4PACxj1J9iGymK6ZP/ZJd/47Rf52dvPKZYEp345X8JiDJHADkuVVRhforplosmYN17b4Hj/kj//jx9hagqVskX7YkypWuTZkwvGR0fYbpHueZ/DRwcMu2NmkwW3bzeIooS5H+OFMY6j02/3uTgb8fCHH+JWXaIg4nzvmMuLEdLKLmprl3F/wuFPf0ZttUoURgw7fbIsI0szJFmiXC+jqAqarlGvO5RKFvMrC3OWZUyWIggrzUTUrSJLLK+eU6uoM1oKJK3r6CRphmkq5PMGuZyg/F22R4xHS8IwxvdCRp0+5yc9xv0xo5EnOtCPLhnOAzZWCly/2cQyVTRZYrueY+yL//lr10u8fzLjX719xsVgQSMvHBi/aH1uodcbLqprE7VFW4YoEmD+QgE0nag/E37qeh0KZVG4k0R44r0FWa+L1FiF5qrg2YeiSGWjPuSKZPvPyM5OBRZ2c1ccELIMRiOigUAohsMFlqViNApo5RzGWonCVoWwPWTxaZvAT3j0L36IIktEowWVus0HR+KWYMkyFVV4SV/aKjKYh9x7fZX+H/4xJyczcn//l8i+8+fg+6RRzOjZJdOxT7NoMp2F7NwooxsKqmOgG4qIgt3YJOtfQhQxf2+P0fMe5naN7etldNfmu++estK08fyYXt9nNA7on05542YZreSglkSiXtgRLXDZ0FDLeeRqWbTaDZNsNCA7PRHPfGVDiOPGfVHoF2Ohvtd08SUroJtIjU2BHV7OkF58RTzr8RhmM9RSjsO3DsVMPknRXZv6tQp6vUBw0kO9f0ccyIJAeOMtS4xYrBxUG3Cyj/LSHbKPHrB4/zmSrkOtJqx+6udOf/7O193qdRzdorvsi3AWQJFkTMVAlRVmoWh3qbKKLuvYmsk0WJCS0VuO6cznvNLYou6UAQF5cU2Tk2mPguHwH/Y+xIsivr69S9kq0ln00RSFoeeRJAmNqst87mHoGkU3Ry5vU6u5VCoF3t07YjSa4YcR/9v/821Kbp7hYIJpGpydXor3gqlAQSdJUtburTNf+tx
7YZt/8ZMfMlss+YPf/DX++PHbAKRpSr8rvPTbWytcdIfUykUkCWzbwLYMzmczblXWaM97xGnK24+e4y191hsVrm2vYloGD7/3Mbtbq0znS04uekzmS5YLjxuvCgHoXx98Tk4u8a+81bqufeb5bzgFTqdTnvb6mKrKdrGFo1m0531MxcCLPebRHFVSKOgFAbKRNVzDZaPQZBzM+K2dlxgOpwRJQm82Z3u1ztFRmyQRIwvXFaOD9bU63csRv751k7VCgSAJidOUeRiSZhlplmCpFrNoxldWXmfgD3jYe4wu6zSsupi9yl/sPQzQLNvk8zp+lFwhcAVS2NKFj3owC5h6Eaoi08zrqIrE3Iso2jpzPyLLMjabBdYqDpoqrNKVosn5cEk5b/DWwzaWpfHanQaWrjKcCwFfECREYYRbc4njGM3QqK/VqTaKGLZwVHz8pMt8Mmc5X/LtP/4+hVKB5SIgDkOi0UB0cq0CmDmyOMZc2yZLM9RCkb/87iP8pc/v/sOXefC4i+fHxFHMZCgOibUb11n2+2iGhmII94dhGhwdj7m7VSZOUqbTgPOjLpqusbpWwt1YJ8syZk8/orq+wnK2JJsOSZMUwzIo377LbCwOK6quE3gBqqZiWAZJnOCWbCxbxzQFmOj8fEohb2CbKuW8wdQLcQyVeZCwCBPSDBxdQZKgYqs08xq6quCFCTfWS/S6U9I0o3s55frNJt1nz/GWIXEUo9tCpNtoVRmPFrx0u06l4qDIMssgpt9fkDM1CqZCPafRnUX8+gtV/vLTIR886TJfhtzbKqMrEqryi8V4n1vo/YNLgt6MeOah/4PfEVYDQyAMs72neOcj0iBC2tgCOy/m7t022ckh2XIh1OlH+2RPPhKQl+X8s0z27OQAqdGEcplsMhaCvCRBcnJ4HzxFr+XJ79YJw1TYzPwIreGirDWRdJVP3zpmOvaZLyJ27tVxrjeYT0P+6uNLrleFT1oGgjTl5nqBZyczbEUmnnp8f6/Pi//oLtg28WAGhoHiGFiWynwesfXGBqurDouxx/rLLfaf9HHrOSbvPCM72Mf73js8+Zc/RG8UcbcrjN4/wpv6DM4m5GSF5TKmtZankNdJAdfV6fY89j5oE4+WLPoL9HoBWVcZvH909bB9cF2RdpfLI23vkJ2fw0KEzlAsC5qg44JmkM1GAvSQxEiFCkyH4rYvK6Lgex7Bp8egqnjP2+x8/RrzJxdIqiKCe3SVeOKh2IaIyS2XYT75GxW9bsB0SHZyCIZF9O5DUBQkXSHzxOmcOBaHgi/wOpt3uFwMkZB5sXIPWZIxVB1LtWkvOkyDBZqisuo0kSWFOI3pLseM/SWzIMBUVX56/px3L57ScMq05xMuFwueDYfsjzrslkpIksTBqM/j/glhEqHJMo/3Tmi1ajg5C8cxqVSLdK4QuDfXmui6xkef7LPwfJIk5atvvkih6HB20WNwOmB9owm2JmLqgoSN1RpnR52rmb3Mo/f2+K1ffoXucox31d1K04w4SYiShPXNJjd215jNPe7d3WXv8BzbsXj45JCPusf8p48/4c9/8iHr63VK5QLvPdxjPJ6TJAk4GouFT6Pqstqo4C98bMeiN5zw9PnZZ8+2eNX23Ns/R5YlFkFI2bY5HA+oWBZ3mw32hkOWkY+tmWwWVrBUC0MxUK/GIH8dFdu0V/Bjn4E3QVeEN9ot5Xn89BhVVXi0d8K9uztctgeYloGui07SeCLCRN5vH3GzsiJ0GIZD1bbpLYcMgxGTYIKpmPys83MAkixFkRV0RWceLXDUL7ZzBOCkM6PfX6LIEv/tr2yIxqIiYWoK7x9POTubiNZ+1aaeU0nSjPEiZO5HnyWp7Z2M+fRsTDVv4ocJ3eGS+SJk5kVUKiII59npmNE8IIgScrZO+3xEbVWQDN2qi+VYzMYz4jjl1p0WTt7h4qNPSOIEMrj79TfIsoz+aRvmI9RiWQiVswxUHTPviGCb+ZRipUjcv2DrRou9szG2rSH
LEnbOxlt4LPojkjihtrPJ/LJLpVnBG00IvID5dMmj4xFv/+yIg2cdaqsVynWXRx8eMO6NSdMUDIcoEO+N4uYW6XSIbuoM9/eZXSnfsyzDdEwCL2B4OcSydeZzYSecTgMsS6VctvD8mMkiZDwPaJUsHF3G0mWiJMPWZHK6jK7IrOZM0iutwVrJQlUk3JLD4bNzDFPnkwfHrN27ReiHVOsFVE0l8ISWJw5jDs+n7K4XCaKEvKXhuiajecAiTBguY1pFnW9/3KWcM9B1hULOQJElDvoem7XcL9w/n1volYKFLEvIqiLU2Isl0TIQ7drJBKOWF0hX0/ob8I1yRUzr9URLfjBAqjVFAcpfceF3dmE6Jeu0xa3QtpHKFbJOm2w4wHr1Figy0WDGcOSTehHmZhUMg/i0Q+pHuEUdx1bZeGkF0ozp0w7nnQV5RWE2C1ElibyiUNZUDs5mZGQs4pQ/e9imrmlEgznS1nWUgolUrqCVc8zmEWmSoa5WWS5jciUbkpSN9RydkwmqpbN4fw9vwDiFLgAAIABJREFUsODmb95A0hRSL8QsWmRZRqfn8ebrLRRV5uJ8Tqfv0Y0iHu6NaK7m2blRxmgJ8UfvaZewN2UyDQkOL0UbPbwSu6UJ2dEh0iuvixGFYUGhgpQvIbk1JCsHsoJk2kKgl8RQrIobvqoKumCjgaQqLN7bw1hxSWYCspB6EYqtE/ZmSLqKbKrgurC2RXZ5KQJ38vmrDaAg3bxD1j5H22mRLZfYL6wj2ZboFkRXKOAv8ErSFE1WSLIES7UJk4goEW/+OIvRZAVFUrBUiyAJ6CwG1G2Xp4MBz/pinj1YLLlVWSNOE7bcGtdKVWH3mk4ZeB5hknC7uopr2hxPxiyiiPu3tlksfLrdEWcXPWbTJSsrFRr5HB/tnzCfLalUiuQdi+3NJuPRjKd7p4RhDDmN3nCCbmpgqsiOxv5JG0wVRZH5+X9+iNPIc9Ybsuuu4vshNbtApVrE84SwbMMtcnTSoVZzRQzr9Q2OTjqoqsJPP9ljNl3y5iu3iJJE/HzNRZZlnu6f8eKXb6HIEr3hhIvLASxj9g/P2dposrPRpFB0iKKY7mBM4Id4fkCvO+L0uIMXRbTnc5I05Ww65de3bjPwJ1iqSU7L4Rou67m1z4rrqtPCj32iNKRoFKlaLo4mDo+bxSKmoXN02KZScwmiWOTaT+bstOqcnndRNZXmSoWKbdPMVTgc90izjLWCSytfR5FkWrkWneUled1hGs7YKDRRJRU/9gmSAEVW/m42599iFfIGmqYwmoe0p6EISUkyMkT7vlg0aVUdajmNRZgyWYaoisRFe0anMyfLYDhcslHLMfFCVEWm4lqYpspg7OF5EUEQc3uzJFTsvQXjqU9ztSSKYLvP5UcfkSYppm3SaOT49NE5/tJHdmu4NZdyo0z7Ykx/77kYNZp54jiG+ubV55WFP54KGp1uMvjoA7AKjEcLrrWKTMaida3rigDyFPJsbJXpPT9AL7poukpxpcZk/xnL2ZKnjy+IwoiV9SpJkjEdzXGrLqquMjnYJ7+1y+T8QhAl955AEjPoDGi8cAun6KCbOlmWsZgsCPyAeHDJsDeh3+6TJCnLZYgkwWwWYJkqyyBmrWKjKRI5XeFa1aSW04iSjEZOx4sTJmGEpkifZaxEcfqZJ34ynFIoFUSgjR8w6M+oNErE8znLhU+5VqDkmlTzIqBsugwp5Ax2mnmGi5hlmHI6Dtht5pksQ25ulVBkicE8IIwTtM+x131uoddeuo19rYGx3YCjPaR8DkVThK965qFV8yTDKWzdhNmA7OMH4gfr9c9ibLEsKNfEbTNLyc7ELZNqVdweTVMU/bNTwZIvlcXvnyyRDY2te030VglpdYWkP2L06ALVtSnXbHK3VvAvxujNIufnC7pRRJSlGLrCKE5Yr1gkWcY4TjBkmUEck2ZQUGT02zvQOUNqNskuO8yP+uQcjVv/8EVGP37MbBrSPp3w8MfH+H7C2v1
VnJe2+ODjLqZrMf7ojCxJkTQFc6tKYbfGa7//MvPunHxOEw6SNEWXZBZJKhwE3TmHPzvGujqFLQZLLEvl8c/OiEdL0valeB5OHunefbLzE+FQUDThO5Vksu4p2WIsmAH9c9FFkRUBLVrdFrf+KES6/zoAatHi7KeHxDOP/YMpDz/p4u13UXIG6lYL+fZNCAKy/aci4tbzxIFBN4R9EcRreBU3HA9noCikE8EW+KIX+hulbbbdNW6Wdul5PUzVJMlSojRi6I+p2WWG/piSUWLgD3jn/Ihl5HO/0aBoiTfchlskr9uM/Cljf8blYkpetyibJrYqQC0fdc84nYxIs+zKm+7jewGVSpHru2vkCzbr1RL7F10uOgNK5QK2adBsVhgMptiOybQ/ZdmbgRcTxwnh5Zy1Vg1FFm1B2zQIhkvQZLIMbq42OJtd8tLaKqfTIUeHbUqlPF965RZ/9sP3CcKI0/Mu7//8UwI/5N7dHV6/vcOzvVMs2+Dp0QWz6RJNU9hp1VldrfKPf+tNhsMpuq4RRTFJGIsbfk/shaUXcHJyiWGIWf1svsQ0dB59ckAQRvSGE2q2jaPrvNJc48POIdvFVRxNKKxlJE5mpyxjjzRLOZoeYakWqqyRZRlNu0GUxARxxG/vvIHtmFQqRR4/OaLXHTFf+jx89ykfPz1mvVWnlHe4UxM3zu5ieDVWGRKnMbZqEaUxcZpgqSaWaqLKKmN/hq5oxFn8d7Yv/7br9Z0yrZU8v3GnxqedBStXEatpBnM/EkK6RUirYPDgfM7ToxGKLLG2KsR7jqOxtSX89ws/Jk5SvDBmvZojCBIMQyWX03l0OGQ8CygVTXRdIY5T5pM5hYrL6sv3sfM25UqOk5MxSZyQKzpYjiWCXiZXuiFvKr7CpcgzGVxQXF8nSxLwZ6hODkZtyFLUXI7tnaqYSW+WKDg650cdDNti91aLB99/H0yHKBTz9v+XuvfssSTN7vx+4c2NG9ffvOnLV3dXd0+bMSR7tTsSRS1XlBaiQAErQBAgfRJ+EWEBLSDplVYSQIhL0Q85tqd72lR12fTmehPeh1482cVXM9CstJrep1BAVSKRN/NGZJxz/udvNtczDr/7IdsHA/yXX6FqKtcXc1RVxnIs3HaDzqDDO//R90iihMbWkCpJxPDjTckvX4lCG6VkiVATFOs5RV6A28N/9jl1VeNtYhRFRtdVhkOH2Txi1LGJM8HnkCS49nLirEKWJF4tE+oabFVBVyXu9EyCtCQvK+7vtkQDoqlML6Zcnc1J5jOWP/sBVydj3NGQbq/JvdtdVuuEZZBSVjVxWmAZKpswIytK1lGGn5SvOQJXi5CmpbHwUvKiIs5/OdfkV4fa/Ot/+cdVmKD80b+g/vgn4LpkLy/R+0Jupx7uivjVdgsml8KxzfchiqjHE6TRCKnXg9VcFP44gtUSTk4oj86QtwbUL1+JRDjTRNrbp56MIQgolr4w3bENFMdg+dePkauK1rv7yMMBqg5KxyW7XHD1bI4f5diygq3IPA9iUegbBnEqzGjiqiKtKyxZxi8rdvwNxdMjpDRGrguMjo0+dJEaNkpVMD5Z47oGu9s2T088trcdfvAnj/nwYQ+100A1VIyhi9IQhj7J8YwqTLC2W3hjnywrWSUFm7Lkd+52+VefXOIWYGgydVXT3nHRDJUXr1a889EhdVmhH45EGp7dAG8t2PTtDvhrpNEh9asvxR02uRQt2vkRqIpoDHQDkhCSCKnVE3DZ0y9RmhbOwCGfeTwZ+1Q1+MsUo66xTFDu30MyLXjxnPTJCZK3Qb57F7p9odGvazg/EdexrqmXa+RBjzoIoSqRLAv5+//FN1aD/NPpj/+4qEq+t/U9vlx+ia1anHqXbDX6HK3PudsW3vTTeMoXs+d0LJNVErNKEqZ+wF5LmLscrSe0DBM/SzharfjqakxQ5LQtiy+vJ2y7TVzTZGA3eLZYECaCJW2aOiBhNyw++fwFzabN7ds77HfaFBJ0Ww7
jyZLj0zG1IlErMqgy1TyGqqa2VYqyoixK8iAVJFZbJV8nJErNdeDjlzmbLKPdbqJpKt/e2yVWYDpb0+u4qIrC2fE1B7dG/Mmf/4z79/YYbfdIkoyH+yPisiCrSuazNUlRsL3T5/x8imUa+HMfvIz3//EjPv2TTwk04fvgNExGW10kJK7GC773W48osgK35eBaJi3TZJVEdEyT3eaQo/U5h+4uT5bPsTWTc/+aWqo496+oEYXYVm2yKiOtUgaW4ET88Ew8zLe2uiRJxtnRNcgSq41PUVXIiszvPfwWlqYzjdb85ePnrOOEu4MBO86Q2+4tZEkiKRJcw0UC/DxkZG9R1AVhEdE2Wtxu3vvG3sMA/8OPz/84Tgv++Vtb/OWzOa6tsQlSdF1lvoq4tdVEkiRmUcF4HWMaKpsgww8ykqSg3TZRFInTK59RzyZOCybTkKcvZgjBhvQ6Da3Z0FFkmeVKTPqGZaLqKlmSU1UVk8s5g1GbVsfBdQ2yoqbpmkyv5kSTsQiJKQsxgCQBSBKFaqOoKpW/ogo9sEVSZhWHBIXGcp0iKzJRUtDtu+RlTa/XQG608J98QvfufeLLUygLGoMB5188wzm4jeVYFFlB0xVTc1FUrOdrsqJma6fL5PhKqKpuyMzG/n0Wn/6USrXI10skw6J3sEue5lRlRe/ePaqywm7aWJaGaaqvw4DubDXZRDl3+xZfXoVYusLYT4nzmqmfkhQ1sgKjhkkt1cLHpaFTI3F05ZNnOd2tDlmakXkbMJvUSUQ6vSJTGzy8O8C2NFZhxpdfXBIEOcOBw7Bl8mBo07V1wqzi7W2bMIfpJgFJcGb8OOfWoMHvPej/+s542dkUdWdA/fnHryc4Y68LqjDKKS+ub/TemZg2j44E6ztNKeNc/Ls/grfeFyEpmzXF0yPqKEYZdsHzkDptkaj27AX1Yi4uimmiD13yZYj3YsLn/8djeh89wHp0SHq1ojy7Ip/5nP2fnzOdxczXCRdpRk1Np2WwLCr+s/t9LtcJUVWR1+JN94qKtqrQURWevlyznoeo9w/BdZFbTeJXU4qrOZ//3SmzLGe5TDg7D9iUIsFpr2WidhvoQ5dkEZJerkiOZigdF1lTkU2N+dMJLy59ni0jsrpmqKn85NWS7zYt7t9uAeA2dcooI1iEvPdb+6TjDYqlw86OiPOtSqFf39qBV08hCsU+vDMQnWmnJ97z3lDA/JNTAYfJCsgKdVUidbaQdZU6K9g8G3N1GXCnbVHUNW5Dpdk2kS2d+vyMej6jznOMt++gvPWA+tULgQyEnrDDvWngpP6AuqggEmhLFWeiEfgGn0t/gq1anAWnBHmIIiuYqoGExO32LlER4egNGlqDpCz4xXiMoSjMQkHSM1WVb48e8ru33qOqayZhyGyxQVZkdlsuQZbRdWzOPY/HsxnjMMBQFJqWiW0bLG4CWX7wd5/xwbfu0+40ub6a8/J6ymYT8Lc/+oL5yiONM6plAnnJ1qADdc3+uwf4c5/cT4S9LECQ02haSC2d08spQRBzp9NhYNt0bYv1yueT8Zif/OQJiRdzOV4QRDGUNUVR0u252A2ToigpipLT+ZLNOmDQaCArQpHws4+/YjJecHU5A11G32ny6afPYafBnf0Ruq7iug3iKCUIY958eMj11QKAh1sDTFXFVg06psWd9h7Pl6cYiogGvtM6QJEUtho9bNXi0N1FkzUuggtcvYV086esK7pmh6bbQJZljo+uOLuYsHe4BXmFbum4jpDy/dXpV1x4cz4bj7m3P+LDwz2ezqc4WoNxNEaWZCpqyqqgb/UxVI1luiQrczapT/rvgY5+vIrYalv8m5cLDE3oqC1LE+mRssT1KqbrGPhJjhdmnF1scGydKMqoKpFZ/s5Bh3/0zgg/zlmuE6Iox2mKyV3TZHRdwfNSrsYBmyBFlsVrFEVBGqUCbr5esH9nRJ6XLBc+42uPOIh59fgEgrV4doVrUVSHYmUrDw8o51cU8yvxvLBc8KY
YO7egPSQOYtbzNY6jY1kayk263WIRcfX4GXR2WD59DO0tSEPiMAFLrBdlWSbPcjbrkNCPcRwR7FJVFSfPL2F1+Q/fl9MjXc7BdGj0e+j9Lfo7faqqwnZs3I5LnubIspAifv2+ybLETsfmi7MVAFlZc7tnoSkSHVtDVyVGLYO2pXK5yTAUmTirCLOKoqy53xfvsWmbLKcizc/a2obEF4NbZxu30+T56Yp1kDKfRziuze07XRbrGNdU8ZISU5NYBinruOTh0MLUFT79TPBj6hq85JcjVL+adf/mbfKzMdJoF+Wf/YGQ2dk6HB5SV7Ug5n2t1Q48iusFaBrlxbVIPNN1sbO/SaSTLAv1cBvp9i2Ri17XolHQdeRh/x8+P8/Jph6bRYQ1cPjWf/0ditmG9MUlxl6PfBUyP17i+Tl5VhFWFR/d6VIDL+ch/8lem9U6paUqNGSZpKrRZYmeptCzNIoasqoWkbeHt5Hu3KOYrlDbNoUf885v7/O9hz3uvzdia2gx0DSuns1JkhLFMbn+0RGTcYQ2dLHuDanjmPkkFN/zOmXQ0Hmz38BRZNKqZsfQ6bdNXp147Gw38PwMxdZpDh0KP8H9zl2Ulg3X19RRJHbydQWbFTRbgom/vIb4JsY3vtGuq6pwtDNsKDIB2Rs35DhZQbJM4pM5xg3JZeqnHDgGRVEjfY3E1DXS9i7lJhbXaO8Q6cEb4vWdlnA7s2zhdb9ZU6U3e3nLoliGgo/xDT777jYnmwvKuuSd3iNm8RxbM9FkYYUqSzK6rHPqXRLlOd4mxNI0PD8i8CMUWSYtM3RZJyoSbE1jd9jlja0BUZ6TFAWmKnKsB7aNqaroqtDfe5uQJM05ONziD3//d3h5dMnl+ZQ37+wRBDGrlS909oYOdc3ttw9AkphMV+w82mXtBciNG0JeVaM0dGjrKIrymvmeZwX3uiMeDXZYRjHbO30CP+L733+few/2ePfRHZoNG6VpcH21wPMjdrZ6nBxfc3o+wbbFZJ4UBWmS4W1CFEWh02+xvTPAdCwyL6HdbTIcdjg+n3C4v0UUJjRdm62tLpIk8caDfQxT59zzSIoCWzOJi5xFvEJXVFRZYZmsmMUL8irHz0I2qY+tWmRlhqEYBLlgWluqSUt3kSXxeFotPSxT/NwX5xNkVycvCjRVYeQ2aRsGb/YPUBUFRZJ4s3/APz54CEDH6LJO11iKSVImBHkgAok0Ibmr6pqszH4Tt+avdd7YbTNex4yaOv/hwx5BkmMYwpVNlmWSrKBji+nz68x6VZHYrCOiICHPS9K8wtZkirKi0dDodKybSV9+bQZTlhW9noWiiPWQpsnEQUzohewdDnj7w7tcnkxZTNcc3uqRJRlZkgnE1mmDrKBtHwKQbnykwT7VcgyqLtaMuiVUWZZLnuWwmlCWJZIsod8k0fl+SrNl428iRm8+oLE1YvTe+5iODc2+kOyVObZjM7+ak14cY1oGqqay2Yg9f7AOKNdzjMM3MPpbgsRc5qitDo1b9wg3Ppqh4S09JEkiiRIkWaI3bCErMstlTJYVNC0NVZUZr2NUWabj6PhpydjPUGThN58VNYYikxQVhiLj54VwsbNUVEUSYXC2MBxKItGkxOdH0NuDIqPRdpFlCVWV6TaFsqAoKgYtC9cxMDWZQUNlGRUoisTxIuZykzFbRHzw3h6urZEkOeN1/Evvn19Z6Gm3Sc8W1NcXkCbEL8dioqtrqjhH0jUkx6G+PKNeLFC/+x5cX6N0W2LaWyyoL08F015RROFwHAHTO44oGHFM7Qm4nziGqyuoKvStFjt/8CHmXhfSFHV3gPH+Q7KrJXVW0N1vcedBV8j9gOMLn6CsaKoKSVISxyV1XfNllGDIEjVQ1vDST1gXBddZxvN5SO1vqM9OKIMEtWniX26Yv1pwcRHwVz865fIq5Nv//X/A9p0O994ZcvbxOf03trgMUjZfXpKcLZj9+IjVOkUxdXo9E0WV8LyMpKowb+IgV5uU7aHNcpk
SRYUosKrC0fOlQEh6rjC6abXBskVYz+6h2JUPdoTULYkF+Q7Ee5nEYtJfTqiLHBoudRpDGlNnKeUmwNhuow9dfD/joG+zv9/k1p22iN4tS2g0SP/k/0J9eFt83fVC6OcBxhc3pkdtpEfvUnzxjDJMRUOmqpRBCnt7/y8fYf9uT9fo4GcRZVWiyArTaIF6Q77y8wBFUlAkmU/Hx6zimH/y8C4/u7xk2HEpipJxEHDmXbNKN6iySteykCWJtmnSsyySoiApCoIwZhIE5GXJNAzxwphW2+F33n+IoWuM1x5v3T/g7QeHfPbVMaqq0Gk32ep36LWbIElcjBegyq/93KM4Fb7gqxSlaVBuhNrBO19RLRPS8w1HLy+J8oTHsyuiMCbPC9Yrn8+/eMXFeMEnf/Mli7XHf/eff5/79/f44Fv3+fKrE+4/2CdahZweX+MHMU+en7H2Q3RDNIVVVTNdrEmuPbSmSVXVhHHCoNsiiTOiOCWKUtI05/R8jCLLdHstepZF17IYNrpkZcm9jnjoH7p7GIpBWZf4WYipGmzZAxbJElM1yauCrMroWwPyqmCRLEjLlK5p4roNhqMufhixNerx4M4e9w53cJo2fpYxsBv8q4//nu/s7DGPIo7WV3RMF0MxuAwvCfIQV3e57d7m+eoIAEVWiIqYtMzomb3fzM35axxbl8nyknVc0LFUXhwtieOcrbYgAtc1TP2Mk2uPIMjY32/x/NWSVtumLIW17OUyZB2Lqa9hahSFcNjrdi02m+S1+9vp6Zo0LYjjnOlEEMYevH1wY8+Rcuv+Nvu3hnz+8RGKpuB2XTRdo9FsgD8n9zbQaKE5DeqNmKAxGmIYkRXwVyArVJMTUHXq2TnpRjS983lIdpPVkUQJ04sp4dkR4x/+NcnVGX/wRx8x3Olz5+07TM/HmA0Tmn0W4yVpkrJZbMguXqKoCtguaRiRximsrqAqKeKYcOWhaBpFXpClmWDmS7CZb9B1heF2h1bLFGEzpmDff3Crg6UrdC0V6wZRmQeiudp2dS42KbYmE+UVErDl6ORVxSoqOFmmOI5OWZZ0Bh3BT7BcNNPAGO1TFAVlWdNsGnz2ZML2lhierxche/0GcVbxZBLhJSWPth1+70GXz05XFEXFsGWyDkQ64RvbzV96//zqQh8E6NttWC6FTK6qkRQJFgvKr9n3Tgvp3hsQhjAeg64z/5snqJ2GcMXrDgQbP4mpnz6GJEFqdajncwHtqyrZ1BOSLc+DGytZZTSgHM+QWq6YXEej1/I+rddkc+kRrmPSrGTUMLB19fUP9ItVgONolMBd00BGQBu2LKNKUNQ1LVVBk8TPEvzZzzDu7XL8wxOSpGTrvT12dhr4ZcU6Lwl//BWba5+zZ3MWm5S//vMX6JLEp0drpidrDEPhzd8SBc+yVNK0QlFk7vUb1EDL1amBzy42LFcJu3c71DfmBnff6oMsC68CwxCFPPSRegOYXoFhCtipLEXBb3Zg+xaS20Pauy/IenkG6xn1aga6KRCBYIXSa1PFGd7xnGZTpzuwyYsKbcsVHvjb25TPXqJ1G0IhUZaisXA7kKVw9y2xq+/0REMhSyBLSN/57X9AcmazX/eZ9f/rKeqCkdMjLVO8zBdkOaOJLCkEmXC5A/ho/03Wacrxes2W4/D3Hz+h03UZ2DZto8np5oqiKjhardBkYam7TlNahkFWlqxXPnleskoSOqaYknb7HZZJIpLiXAdTVZGB4aiLYWhsvJAwSji9mjIctEUoRZSL5Lmn12z12xi6Dn2Tqq5Ak0Vz7OqgSEgjG6qaF8sZv3h2wt6gyy8+e0GaFfyjb7/F7lYPkoIszvjJ82NOjq/58skxl5M5f/UXH6M0dM6uZlxcTJEkibfevEVZVrSbjZsJQ2H/0T55mNJ2HcIgYfxyzGy5YTTq3tjwquzvDsV7XZTsNkXK3MvlGUPb5TKY0LVaeJlHVVfsO7vcb99lz9mhb/W47d6ia3So6hI/85nHM5qagyp
rBHmIpihIssRq4dFqNuh0mvh+RG/Q5vbOkEeDAZ9PJ6iayt+dHOPoOh2zycjeYhbP2Wlso8kqsiRTViWObrGINxw4h9iqaNqCPPjN3aD/D09R1Zi6ysRL+cV1hGVplGVNXlaU5T+QsG7vtIjjnOtrn27X4ujJKZ1eE9cxBGls7Atv+iCl0dDouyJi1TBUoT3fCLQwinJc18C0dOHuZqnCxruho6oyiiKxc2sk9sNrH1VTCSdj7HtvC6vzYEV+dQyA5raQDAOcrkAKq1JM905PFP/+HiQ+y2XMcu7TaplcHo+Jw5iDezs0Du4Iwx3d4k//9AuuTyccPT2HPCOYzpEMg3Qxw1/5qKpK6+E7mLaJot/4jNQVxp1HIMmolgXLS8rLF6TLJVt7Q5I4od1v0+q30DSF2XiNaSqoqsLST9keNHh86dFxDKZBTlZWtCyFBwOTuz3x9x/fadG2VJqGwizKON8k9GyNpqEw3ogwIqflCLOh9haq2yLfrDEsg+FOn0dvDBiPfYqi5NmLOVtbDqOuzbd2GsR5xQd7DroqY2sipKuoal48PuW/fDQkSgs0TeFs+W870bdaaD2HermC6VTskasaJOGBz+6uMG8pCvHXMCCK6H/0gGIV3rDubVEobk49mVL//GdCVqcokCSCBGdbQhLmumJKrGuUpk38+ZH4OtfXpJ8+RbY01E6D/ru7jCcR7ZZOkpachQltVSGuKhqKQhjmXGc5flmSVBUvk4ysrsnrmhKwZBlFgvmf/BRt6FJOFsRxQXerweara5Kk5KODDg1F5uj5ktPrAMfRWRclbVXhwV6TVVGw8TLMoUt0vSENU/K8QtMkTFOkKemSRF5UXKTi9TdliT8NePVkzuZyw/h0A3GM9vCWgL/MG+i90RTs9xtrU6m/i+R2kToDEfMoSaIB0k2ke+9S5xlSoylYrlEAWYL09rtEE4/GwKHIK7x1gqYrIoio0RAe+MdzkUI4Gol8eRBf+/kX8MVPwXZEI5GlArH/1j2xOkgSkSoo/+pb6Dd9LNXmoLnLOtvgZR4DW8gb0zIlK3Nc3eXUvyAtMoIwxjUMpmHIO2/fZTpZ0jEbuIbDbnNIVpZkZcnpdMHfnr1kFcdUgCxJ9AdtHMtg2Giw02xyty/u+bKsODkds4pilnHMJ0+OUFUFt+Wwu9tnfD3n1u4W87VHOg0w+zfKFkPharokufaEb0VWwiIBVYashLwSHzMUnpxc0nRtxmuxRtna6vDjT54ShDGjd/awHYsvvjzi7GpGq9mg8jLQFO4cjCiufIIoYWu7x3LhkaUCxs6yHEmS8IIIDEXs+eMC0pLl0mO59Di/nDIeL7m8nuNFMbf6YgXQMm1M1WBgC0KdqRo4msPI3qJjdGgbLdpGG1VScTUXTdZ4q/uIoi5paA38PCAuYlbJmn96531mkxVuq0EQJiyXHqoqXN5ahoGfZZycT5Akib12i+/sHKKhuw5eAAAgAElEQVRIMmVd8XRxzF+c/R1NvUlWZSRlQktv8dHOh6RlgqboVN9wjsnXZ+hofHi7w9xPmWwSsfM1VZSbhuzWVpOFnxImOUGQ0m6bLJcx27d3GV/MMXWFQctkt2eT5yWLRcT52ZpX52tkWULTBGyuaiqdjkWnY2FZGr2ehW1rTKchr55ekiQ5YZjx8uklrZZJp+ugairx5Sn2cItosQR/Ac2e2MVrJvlmTX35XBT3ugZ/Lgo8iGeNtwBDBMC0Og7LZUhZlIwORlyeTATcffAOstOivHhGsRgLv/o0hLrCaTkweUUVbNBNnSiI8Fc+pm2CJ4jg6XoDmkGRpIIIWFeQxUwupqRxyuR8QuiFeF7C7XvD1777eVlh6SrGTaBNy1IYNXXe23bYb1l0TZ2qhpau0TZVfnuvTVNXGDY0VlGBl4hchfcOOyyuF7T7bdjMKOZj1GYLwzLY23NZ+ylhKNAF1zVpNUWIzSzMSfKSP3uywNZkLjYZF554Fv+LP/oO12FCuyGsetV/W3k
dmw3SGw8FRHvzQFfuHkIYovebYgLvDgRhq9u9MXZpUa4D9C1BPKsvT0VQii/86QsvFhIyTROBK3VN9GwMt25RP3smOrD1mmq2IHpygdK0iL84xv/hVxh3d0CSePGnT0hOZuzuNLgeR+RVzVDXMDQFv6zoqSqnSYqjyNiyzKoocBQhSRhpGq6i8EWYUNVweRVSBgnFOqLl6mRRhqxIlFXNi2ufbsvgKEx4dL/DcpnwvQddnscpf3u8ZEfXufeoj6QpvDgWuufWm9v0uiaNhkZZVrz99pCrjfhednWd9+91yIsaPytpH3TYfWNANvMoXp0LHX1V38jlClHoG46A6YtUdMJFQZ2EggF/YzGKJCE5LTGF2y5SZ/C6+bK3XMowRVElNFUgInKrCcslydMzJF1BUmRxLfpDkVEwvhDpeP6G+u//UpggPX+KMui8Dh6S3vs2kqa8bkS+qWedrtl39lmnYqIE6Jk9gjygqTeo6oqW0cTPIna6bRRJ4qDVIolTdnYHzOOAc3/C0fqKpBC7N98TU48qy3RNk7yquLqYcb/b5cvrCZoss04Sji6nTK4XOE2b68sZl1dz9g9HJHHK3/zdLwiCmHbP5WoqnCf1QYOyrCjnEVrXBj9D33KQNBlWGTQ0CHMa3YaIhB3HEBW8PL5CkiSiMMGxLTbrAMPQXwfadNtNqCoe3N0liGIefXgPZjEvfvYK86DN7nafsix5cXRJr9/m4NaIXqfF/nafNMv53odvslyLe03Zdrh7aweAcBly7/4eu9t9fC/i6fk1syhCRnoNjVuqga1a9K0euqJjKAamYqJKKo7mUFEhSzKqrDK0BsRFQlNz2GlsY2sWQRbTdG3WKx/bMlAUBV1T2WoLIuRXk+kNdF0zbDTYdvqMGn1OvXNUWWESevzF2d8T5CHLdIWpGORVTlImHDgHry2Rv+nncpPx5lB4qNd1TXWTmJbmFe2mIdQ8N7asnY5gjLfbJmVRcnBnRJAUzL2U83mIosioqkLohxRFRV1Dq2UQRTnLyZJWy+T6+sa+2EuZjH3CIKa71WU2Xgv/90GHydjj5U8/E+slpyN256oGbl8gkzeMe5IAdu4L8lm0FlC+rIhdfRLAZgLra6bnY5HkWVZIskQap+RJSpllkKeicLe2sHcPqIIN3fv3IY3wH38Me48w+wM0QyM/f45hGWiGhtQeoGrCW6Q52oLZqWg0egc09g4p0gwWV9x6sIdhGSyma87Plvi+cHxs2zrrMENTxX6+bamMGgZ9y6BrCOJfx9CoqekaBk1dZWAJu+2GIfP2to0iS5R1TbPTZDVdCVJho0VZlli2ju9nTCYBSZig6RrdrsXtrSaths7lOiXOChZewt8+mXK9Sfj8IsAxtdc8iv/mO7tIEmTFL38W/2rvxzSFJKEME9Q8F9DtZgO6Lgq+LItpryhE4V6vwbap0oIqL9FuYHo0XVzU62u0+4f/oK+fzajWHmrHhvkcaXubOgxgtSKbeqJZqGuU7TbGnkx2Mia9XrN7u0OyiVFVmUZDY7mOGdk6kyjju4dtfnG2wVUUNkVJCURVTf/GwOA6yzlPC2QJFkWBJMGXn024teswemeb059fsHung2YUNMYR41XCQNP4s8dT/vl/+hZ/82+ecdcy+LEX4Sgyl0crymIpYgsPeuSLgCQtKQsxKTz+coaryBS1jF+WHJ/49Lomd3cdgqsN9tDB2tuB3V2kpitkIFVJ7W2Q2l3BvL94Rb0Q0HodR0i37lNbttjRrxfU3logJ9sH0BlQz69upvCEKs1ZzmMR5OPomLf6AGSTNenMR2uayKYOi4Vo2pKY+vxUNA+37yHtHooGY+DB2ZmYLhczpPUSSZW/8Tp6TVaJikh051VOVVdUdYmtWnSMNkmZkpYZYR7R0DQWcSysU6uKwI/o7OxSVSVN3UJCsPDfurdP2xBQaJjnBGGMaRks4pi9TotJGLIMRTxtq+2QJBndXkt8znyN78fc2R+RJBm9tksYJYy
25L0nJ3iuS7pMYVlSL3Pqq4TWk6n18N4+q8sVzbLAHUc0bUN1nXZRrMaZcMt2t8YxVkriGFKdCZaRGFpTgKvGyFHqzurWwvDQXZAtXYa9heELQ7J7EdK3DHteeO4XezsHOY+8NgY8TqcIUM42gKfNjM7fFwld45ji7YrRT1vUqNijdRWcmfci1F3j4Rt0w1OiQrBNQymvt65qDo52SOoS19VMRwMBHpqW5Trhar5k2O8RRT7LPOf2ZExZl5RNhVaKLx388Mf2GgbIA/fL52nJclNwsS6JApdvP5pzlta8ctBjv+/zZJETBZo/8+lDvvL+jHVS0u/5Jr7UoSgb6d20J9aqVlaoPYb7U8IoIOpH9AYRi9mK5fExtaMl8CVd0qSJfF0Ke13v3hRZ3WYm07stYuuZsN8tq9z+3PrON6ZAp4uOqV/lUiAbYyyjRMfOeiYZ7SDs+7Y15jfrbjVglQC2YbAuedoFHPneKgHs5Ow4ZuqOOgg/GpikOadz/YOuoYiHglS8IDdti3wbUyu6/7zzz68KQU9ou2CfqpDXtiXjmWAfqyZwFEwPDRk2JxiNcFwfPJ+2rinXa2ociRcuG/b2emR5TdUI8fQv//Cd3/M6/siJPv3113D/8I/guJomLVl/7T3iP/1T8m9ff1f2847D8S99HZI18U/8IGiFunkEmw2MRrjDmLDnMxr59HoevnIYac144NG7NYHbt2EyYfOVb1Jcrnj5lQmHRz0CJQS0QDkc+S57rssnopBVVaMd0YkHxo3OdRxuB2L1OivrbXxsafbWDTBx9dZsSyEwfmMK93kpu9Ckqekrxb3QFykccvu0aYwhTctAKw58vTXTGWpnG6gVv/A1sLXWFemb7M6tTt82F6Lxd7b7eosWjAyb/sok3I2MTDBwHAITLBErhQKDJEBhIKFZ2XBVVcyrmruhx3UlTnyi4xfuwnnZcFyIB/9lWfG7J0t8Q8jL8orhwGc09NndDTk6jMVAx3dp8rL7hRuN0MOPN+z5O2ev8+O3fpS6aUirnF9877f4gYPPMUsXvD1/zKba4CuPX3zvNyjqki8dfZ7IDRgHQ270d9mJIgbDmF4vYm9nRC8OIdA4OyHjYZ/hqMfL0yk7vZg33npCXdW8+vAWu5Mhbuyj9iPZXR8KQYydkNPjS6wvvVKOTMexSabzlUz1YOBvRA9vD1tkXTNBV2ayT6pOa+4q4QH0TPqd4oNadGtnu31MM5FbpzzldIXanoc9yhd2+WA62xeQhaKRn9nz0E5n1GMCcfR+3EHyPc8w/h1ZZ1gtf9UKiXBdwm4k3AUr9Vub8J7cvO5K/j575wzPlcjnTZrhOA69KCQKfe7e2KeqKrTWOI7DZZJQtw378fQP4Kr77h//628949/80k2SpORqkfH6o2v+o595iSdnK3759ROSsiLJKv6H/+1N0qrmp75wgzB0ub/f5/IyIY49PN+jNx0LkSweS2GKR0TTCaNJnzu3RgyHAefPr2jqhsGNG5LVHo9EJw4mbvYAhnvGzKWWid8y7V3fxMMm3b7bcbakMrxQit6L+3HrCW/XABZm703k+Wzqm93pOw7bcB0bTmPDYqzTni2gdrLXbtcUWL6BJfq9WJTDvtmvhx20rz15PSsjL+yba2abW9/IpJ8uOzWAhfTt4yXzDsF4kVtgVQbQTfcXTzr0A7YhWkEcEU0nOI7EMTd1Q1HUuK4iDlz60YcbP31koc+uNtS/8U/x/tyfBcDfH8KdBzg/+TNipqMU1ZNTDr9wi+Yb3xT2fNOSfOUb0DTUj5/haAflaoKDIft7ETvDgIFWzFcl5flSdhHrNSryWK1Lnj5e8rV3rjktShzgbuCz77k0wHlZUrSyt25oOStKtNHXv5+VLKuGiZlYfSV+8lkjEbMgCXWhKdj3jNWreNC3VAisvjR7a8f826Hv4hmCW2F08SeFTMmxdrbe8oUh19UGvg+UyNiEr9SyqmXnLg2Hs33zbVJeX8sUPzSPCdKsjLT
iX7k1pjbsfdHKl4YjIPv1wBHk4VFWMHQVu15HylvVNbcDvd3fV4Z3cORL43NZ1hz5Hi0t76YZF2VFWbbkRc3sWgIywkFIvBNTHF+j4hBWK5wwgsoYKH2Mj+tsyT85+U1+9v6fINA++70et/u3+GN3f4KyLlGOZp6v+MzeXY43x6zLNQ4Ov3XyBrEb8cbpuXiCpzlh6LMzHtCPQwLfpyzFsWpTlsyShCj02SQ5xyeXPH5+TnWd4nse4U6PMA4IfNcUZETfnlYUTxYCkTuOmOmkdedvb6VtddMR3QBj9yiWuKG7DXXZMvuLGpZF56s/Nixmu6e3MKZDt3qxu3qQ57ZTvfXHt0XbRuS+CBPa+7mmYei7gkrYpiB2+eSPvrqF/1ta4RTYFYNvZIPKEfc/V30QpVgWMA0/uKv3VOfqB6jdCIY+xfmG/HLDcpVQlhJsU9cNZVUzmQxYzCUbfJambMqM5+tzQsvy/hgf19cpf/+1M/7mv/EFRv2ALCs5iEP+2OePAOh5Lkle8dM/co/fPV3x3sUGpRx+881zvu+T+5yfr2nbljIv8QIPtxeLb7vnU2QFeV6R5pU48OUlm8WKzXLD/DvflinX88XUpW0ForcQM8jXy4tuV54susjZzXXnmFeX3S7eptNZlrolr9lgGpBCONg1u/JKiqlFBSwhb3Pdkewio5u3UH2eSOG1mfXWH79tu+Jqm4eq6CSANlJXe51dbW8CQY+jH/ySPFaVy79lq44x78cC3YOsGfJECnaZv8D8dzq1QJF2Ubm2obBkx80c1jPy62vq+SXVZk1++pSmbvBDnzzNiWKf2Swl8DRpXv2L7+gHD/fJn17R/uN/hPfv/1X55T4/If9b/5PIN65noBX6YId6nVO+8S7VbEP8J38MZ28PPeqjd8dkqwx3HBPsDQCIfZeX7g/pf+kV2qfPKE+uaMua3Zf3th4dP3g0YlHVnJUV30oy3k5zgbINtO05DkXb8q0kZ6g1D0OfWCu042wlbyATt02k8x2HHWMa815WsucJUz/WDqdFvSW2DbVmWdfbJDoQclysnO0O3jVT+KySfXfadP72QqaTr30znETKMbwlKfpV224helvYXccxaIUyMjvRyr5xtt6irWuzU9r1JIa3aFtqWk7Lih3PZV0Lb+CylGkdYG0c/ex5V6YR+XQcMnYV72TS/cZafAre26SsNqXI65KK508WrC82eAdDIVqendE+fQJxjPuZVz7qEvqeHzvRiMtkzq88/1X+/Et/jp1wxHV+zS89/se4yuUivaBtW1zlcpXOebo6ZVUk/JE7X8TT4pXu+1Jkg9BnOOqjlEIrh5ce3OT+3g6PFwuuZytcT/PSy7fQWtPkFYcPDsg2GdnlmuzJnM3zJf4kwosCiF3cUSiEuYV08u5Rv/uN3JQmqAb5vWuR27YIpD304TTpiqWnpXjaohxqA+938PZWXlfU3V4prboGoWmlqbAXvf3gqF9oAqCT2nnqg58ged01D9rA9Ebv/uY7TzqL200pzxGbxkebAr4wBjlgJHZNN+HPjVFQbHa6K5nwg9sjaBDb3LKGSONOIqpFRl6UVHVFmhWsNinX1yt6/YijyYjZJuHxYsFFcr31vf84H7dujZgtUv7i//hP+W/+9GfYn8a8v9jwf772HKUUXz9dC9rbNPzady45m6csFjmff7jL88sNo1FEf9ijyAuiXiSa66oC7THeG3N0NGCxyCjLht6ox71P3EG7Gvo7eHtHBr5fy2RaFfi7B1JYQYJY7L7fFZSApu6iYi173fWlgNoi25tIkU+XnTbeatztLt7q3e0E75v/K9skbB9TiSdAXXb6ekuSs0dVdJK/uuoey5LvLCnOCzqHPdsAZOLxf/La1zonPwvp29XD1nCH7nWANAMvwvStgfKjoawm8k3nGGiliUrD7q2OoV8VEI+pq5p0kxJEAb2eT5ZVnJytubpOmfT+Bb3u67SgXufQ65H89b9B+J//dcr/+1fJjq9RkSe
RsgA3b+LuDMieXuH4mvSXfl0m/N1d2jRlvigor6SjrOsG31e42oE8x+nFuMOIap2zfjLj7r0hrxwNqJuWG4HPUGt2Pc2twCOpG06LkqJpWNYNY635yZ0B66bhrKzwHZhVNTWyMx8ZM5tlJYx/5Yi+31cOtwJ3S4yrWpi6iid5afzgaxzgsqrZ1LL7TxopzEVrJ/HuArIZ85H5YNQID6CnlJHoyffK7NI9x+HQyIheLPKdD79M/yNXs++5tAhisKpr0zQoEsOcT+qGi1KaEgdJ2FOOw9RVKMdhWUmq31lRb61+AVyEye86DjuuK2z/Vt6/smk5Me/zvXtDyqImiqTYVcuU/OmVrGZOT3EGg4+6hL7nx2Uyp24bPOXyX3/1v+OvfOov8Vtnr7EuUwLtyzXZ1oyCPsOgxyxdkFU5b1y9zTLfMAlDiqJithCZaZYVFGVFEMhEP88yPKWIeyF5VvDsyRk74wFHN/ZIspwgDmRa7QksX1xsKK8TAKpFhhoF9O5NIKmoTtdS1DaGnV6Ynb2iM9Oxu/a6lYneFvpASSOwKuVCWBtb20UpTYOnPrhr/71+861jnr22pXJ0UbftP1PwbZSsRQVCmxHedM51A494xxh52HVAXouUMKul2Kd1p/PvucYsR8vXypFmZCChNVtDi0hul5+t5XvbURcN1SKDsqGYp7RJxdH+VHz4ESvZp+dXpElGWpYcr1ZCBPyYHxKkVJImBf/W//zb/I0//kn+j29ciIFO3+d0VVA3LYPQZdoPOD/fEIaabzySZL7aqoOSFWEcih2u1pAnVGVFWUoiXl0LJPz03RMGY1HZKKW6PbiB34uLEyn+yuzwDx7A3h2ZvG1+/PXzzvoVusm3rrq/4QVzHNXJ0ZKFFNsi6UxjUlEDbPPsoTPUgW4q9oJuVeA4HRsf5DnsXn67Wvhn0KmqkH9Pl+a6b+Q+B/flcYq0Y98XqaAKljtgYfmwLxB/XQkqoV0p6OHgg0ZA279V5+Bn36t00xETHYdgOKBaXqO1RinFYpFR5CVVVbPZFCytUdXvcXzkFe6OTMcTRRJH+9Vfw713EzfyCT/7EJIEFbg4owkEATr0iV65gb8/xAkNu7JuJH3taExbN4xHAZtNifJckrdOKE+uKK/WBHsDBg/2oGkpi5rLecaiqinbhokW+DxQilgpPOUw1IqsbXic5BQG4h65mqqFiSvFtWgxefJS8F8MqYmUFD8EkbYAACAASURBVMLrqmGkFQ7Q14rUMPGtBG5WNVsIvTJ6+HUtuvXGfOjZCblqZYfvK2cbrNPA1p62Z+x2Q+WYKRsz2cvfviNSOAXc9D32PI+1ybMvWmHj121LaqR330kLA/urbTRv8cIHcagcNob0dzPQnBQ1G9NENLD1KFjWNUMtTcW9QcirA5HF5E3Ld95fEEUuKvKpFgmOdghevglRRDOb01bVR11C3/NjFPbJ64Kyqciqiv/n5Fe4M7iBr1zuj25TNAU9LyZ2YzHPUZrDfsfEHochSjnM52umQylYe9MRVVXjeS6Pnp6x3KTkWcFo3Gcw7JEXEt6yXCdiuuMpwp0e3igSGD32pOj1PZqrjM0qlUI1MCY3is6atmy6opeZnTxAz5XAlxbZVdv/dgu7+2pLdNvK3dq2g+/tFK9NYa8a+dpI24COUWohfOWYJD1DErQaeyvp81TnrtdCcNDnxs09knUqPgFNK/cJdGeuMzcQvm0wzL7+A+53LVvDHYndbWVnb1357L9rB/oe/iRidGcKZYM3CDm/muN5LmHgkyQZSikGwx73xmMi1yWv8z/Qa/C7ccSBS123JKuE1Srnf3/rjINRRK/n84P3JgAMIw9XOYSelmS7GyOOdmJ8X6OUQ5pk0LZo25iZ4hf3Y+bzjPk8JQxdJrtDesMem5XovvPZ5da+1pkcCCkNDIRt5HTLS8gStkE2RSo8gLDfPZeBv7dyNVuMLVHP7udtIbSwv+N0DHYry7PEUCv3A4HEHaeDyq0pThB
3UzIIpB6POu6AbRDsXv3FQJ6qELc7PxLf++VFtx6wyIFFD9JlN7Xbqd8iD47q5Hr2/Smzbs9vX49VBgAEEfrg7paTkJ8/B0C7mnSdUuQl/UHI/n6ftpX//w87PrqVtTg60PvMLYpf/EfUj4+J/8gPQBjSPn+OikPaqwvIMpq8pLpaoT/zSbG1vb6mvFpzdBjTGKmOUg4P7g+p8hJvpy/2upHP8pkY8uR5zdCwjhe1MO6/kYgxzlVVsaxr5lXDG0nBZSlGL3a6Ps5FRtaYfbrrSFFf1y1nhdjFaqSALeuaqauNvl0CcDTC0nfNZ0tiNfOtaNuTWhLqWvPGZY0UfllnGv27Icbl5pxy0yRY85zA6OVBCvGh7+I74s6nHEEHHoQBB5FPZmJsoUMlirbl3FjwWl992zAsaiPRMz72u55ICy1Df+gqjnzN2FU8y0tCpbb5JctazIxONjlnaWFCeRrq1qxWN7mQnN46pXh8Jp3+eAjHxx95CX2vj6ZpWBcFrtK8NLnJPzl+jVW55odvfD8tLaWxxM3qjKRMqduGy2TO3eEtJuGQe6M9FvM1L92/wWy5pjFWwzePdlkuNhJlm+b0+hEXFws8E+SktcLVWiRvxxuy4yXlyUrg9cQ41Z0l3R67bqWQG0h6S7qzMPimkuLnKpne1yX1Its64Dl2Qg501xBYuRt8sCC37QfheNs82In9RUMd1+kaALuX16q7j6e6Qu2Yx/AUNx4ccGN/StvCZGfYEemsA9/SuoSZJsR15Nw3Vbdm0I4Ud8eco6e6qF1roGNRCkv6m+UU61wmeOVQZgVFWRL4HlXdEAQe17Mly8WGnucxjSKus+Uf3AX4XTqysqauG4Io4HOv7vG3fuE7APzsF2+RlA1pIamTq6zm/dMVdd3y6GTJ524NUUq8QYqsYHTnNkVRUVc1QRQwPDpkfjknDF3KoqRpWuZXKya7Q+qqxnVdCM3O+OIR7epanPKgY+0vjFGLnbxfLIDWnla5AlOvr0wwjgmAWV0aMx0zZQ/32brcWV25nXJtIIzS8lw23hWMYc/ig4XbGvhYc5wXGwSLLmhXnm/rTKe6fPqmhp1b8r1v5HE22KYqOiKfve3o4IP6+yrv+Al+JI2PPU/rLWAbAOsxUGYC9QO0LXWeSXOxPBc0wJcBJB7ELK+XzGdrsqzEcWCdffjQ9ZGFvrhcEd7ZheUS9af+PE1e0uQV6o//HAwGOIeHEEXkX/maGOr4Lu7hFLIMkgTnwQP0UOz8HOXgjiS69vIqI9gb4O0O0HGAjjzGrx5QXq6YL3LefLREOw6Hnsen45CHoc++56EQDfnU1bwceey4GgUc+K6xpBVinVjMOtzwXZ7k8uL3jP3uxvi7p41o2JUD87rm0NfkbfsBdjyIjK4wevlQ2aJqyNBmkrafyZ4h5om/vuzzI1OcrQWuhdSnnpgGgbHV1Zq+VvS0YmcacvdOnx/60bskRu7XII2GPY+eUROY62Hr1lc0QrYr25bn5rUvqmbLGdCOw4HnEWvFum5IG0EoImW+N/wBQUda8rbhYp5xcZlSzDb0XjXsW61pV2ua6w9PTPo4HCebGX3fZ5Gv+JEbXyT2Qq6SOZ+dfh/a0Qy8PrEX8mhxTFKlHPZ2mYRDXEdznS2pmoo4DqnrhjD0GQxiPFdzfjlnujPE813atiXwPXZ2hizmazZpzvnVgsB38aYx+993E3ZDgptS8PQo7EhykRboOtCdXz101q7jQIqiLYSRFlje2uQ2Qqprl4XY6kIH01etTPuh2xX1Fxn31kHP191KwDrk2cneNh32j2f2/9Z/fuDL154SFCHUuLHPeDzg4cOb/JEf+jTX16vOUz81FsCOI49jkQvLF7BNiG0ekqpDK6p2W9j9/V4nQ7RmOwMz0ZQN9TKXcyob2uucxWrDfLnm7Oya6Y4gMo/mc8qm+ZcCup/NUpqm5f7DXX7mU7tMJhHvn6/42VcPGEeaSc+jbhpef/eS09MVB/s97h3J5L1
Y5IzHIWEckqc5QeAx2hmCI7LEo9u7stIMfCaTCO1qzp9f4TgO1eVztOehxntw9LIU1tG+FLD+1Ey+ZmK25Dm749aucaZzu2z4cCCTvZ107RRt4fJkztaIZrjf2csCW3vdupLHtCx21++KqV0H2GbhRTMeC7dbyN6S+upKbmM18hZlCHqEgx73PvOAL/3EZ82ufmVc9AxzvyrkNdloXfv8lpxnbXytBM/15fVb58DhnjRBVo1g/fBdXwyKkqVRH4wF4VhfU8yvuTy5ZDQdobSiLJuP1NDDP29Hv0jFsSqKKP723wbHIX9+Tfv6b8CzZ3B5yerXvkn4R3+EtmwIbk3B89j8X79Ds05oNxuarMT3xY+7vFzh+5KKFtyY0GQF+fNrkrfPKC9WlEWN72mmsbeFoK25y7OiMBp1gb4XVb2dmkOluCgFfm7alomruB24PMpK7gQuE1Ncz8uanpLwGZn06+2O/LSoCZVM3rOypjCFc123xAa2V2a1KSqldjs5K0cMeeybGRlNfGiKu+842/CZSEmITmSahqFW7Hkeu57LQGt+7CdfYffmkMGPfY7V02tqA827jqwgQD7zV3VDX8tjOQ5bFzy7Bjgva6PFV9wKXOaVoAAXpeTT3/Q9pq6E+nwmDplXNbuuy3lZcV1VW7KjwiFyNVXVMJ/n5M+u8V+6JfyK0RC1t/P7+qD6Xh1XScKN/h778S5/981/SNVUXGdLZvklJ5szlsWK78ye8InpQ+qmYRwMyKqc3zx5nRaR5OW5wPJVVTO7kumvLCvuHexSFhXrVcrTJ2dkWUFeVASeSxT6VLUYD52fzaCsyRcpaEeKUChGL0AHs+c1Tt/rfO73QvF+73nyp2jgOpcC5xoYfV1uoXLmRUe4c02xr5tuSq+aDiK3cbTQmeHAB+V1Lx5Wn182nRuetattW1TkMtkd4fiaP/VHv8iXXr3Pv/ry9zHLss45DwSBSKuOLGgncuuIF+iO3Z9W8se+P4WB+zclRVnhTyJpbtKK4NYQkkrkjBbNML8vehoZb4iKpmk4P5tx50gms+PlkkW+/m5fdt/14/JijePAjd0e/+XPf5skKTg9W7PISr5xvOJilfPtt6/4gVf2WM03vHQ45PnVhn/wlSeMx+F29x5EgXgLXK8IwoAiL9jd7eE4sLxe8t7bZ2RJRrpJaeoGNT3EUSLnojZ79Y2RitkAGetU15tgc9y3Ge9+JIVwcdZNsXYKtqxzmzVvY2gtsc/62FsJnp2ULbxu4W8bGGMhdOgMetq286FvWzmv4X537m4gk7h93tEBznAKowM+9WNf5NVP3+Jf//EHzOdZx5C3u3hLGCwzea5o+EHuQV12/AB7zlYp4Mfdf27bSGxw24gBj6OE4Ng2XZRuXcr6JBqCH9I2LYvZgp3dAZ6n6Pd9ZqsPj1v+yELf/4nPU5wvYbOheD5HxxIfm/78L+D8ub9Ak+YURUM7n9PkpXjfZxnuIEQd7MHZmXztadypTPD+tMd4FOD4HvU6R8c+i6uEOsnRyiHLKzZZxbyqOC1LsqZBOzDSmiPPM2Q5cY4bmYlYJlJTmMy0nJgJ9rpqODP+803bcm0Idpa97zvO1jHvvKhEX4/s60MzlSdG+161nQ7ffg5aBzyN2esbwp6Ehdlzkgalr7Wx6XWJlCJWmsBR7EYeg57H1HPxbu9zeSyyw28+XphAmtY4+nUqfddxCByRFnY/k5/fDXw2dcvmBcnf0BgEHXia07JkXdcow/JPmoaXo4Cxq/l8P6avNUOtGWnNTugRxy5tA/2+J+uZJ6c055c0sznp6+/8cz6ivrfHDxw94O3ZMXVT42sXX3sErs+vHn+FP/PgT3OZzlnlslfztYevPfZ7O9RtzYPRPb51cc5LNw9Ik5wo8NndG+M4Drdu7In6w3eJ4oCqqinyksB3yYqSJM3JrzZUSQFa4YZSnHuTHsE03vrZO0O/m6yVQ2s94CNXiHSN2avbXPhAmwJYSxCOtZO1EH/RyMR
ct0Z+Zn7F7Z7eJtBZHbxtAFo6e13ojHFsAR6Y5mIvAu3g7fbwegF+6OEGHv04Is9L9qdj/l/m3uxJluy+7/vkyT2z9t5v913n3pkBBoNlAGJAkABJwQBkmiZlkHSEwrLD4bDDYT/IDluhF7/A/4Ae7FB4eXGE7HCELUuiKNlcIBIiCIIECYAABrPfbe7Se9deuZ9MP5xzKnskzjhIWcRkREf3ra7uyqrOW7/f7/v7Ls8MR2RSshWOePvek3Z/7ojWyMfs6kE3KvqxTdMR2Jf28VVLwLNQuvrTlOIiUfftuOTjhOhKH9dx6FzpY29GuIGH0/EJfJVC6Do2QeDjug6Pj8+5SFNiz2NqoNIP8PHZT10F1MsUxx5h6BLHLv/dNx/wt3/mGc5mGbZt8eRixc3bW0Se4Kef3yIIbLb7AWdnK65eH9HUDb5vE0QqGCfqRghhrdcCySKh0+/QH/Wpa9WoViePIVkoiV2oip2Iu6yjXf24Lbxm7x73FXR9OZGuKlqXPENYy5Yt2Q1aQl6Zsw6xMSx1aHf5plDnmhdgoG8/fjc07/ptURa23ttbapI2GnfDjt+8Cr1NGikhiLm53+PkZMkgVBGwa2KcaUA8nT1vbHplpSSIJrnPEspBz6w4mka7/UXqtenvKFhe2Gpyt13q+VgV/VJzA8xO3w1wuj28Xm8dCRz3YrKsZDxOmU4zqkv14V883n/en0woZynV8Zj4Y9eRSU6dFDiDGCZnlOcLwkGIvPcQb2+odPT7+wS/9CWsFz8OvR7WaEjnsx+mOBpjOTZymfPwnQXZ/RMsz2F5tqIsao5PErxRrJouW1A2DQ4W40pyVkreyUv+dJWR1AqWvxV4aEUuk0oV7qSu2dRTatU0bGut+Kbu7CNbrPPmpxrOnlaSeSXV5K2n70AYBjxrQt1K7+ONDa45TPqnccZzLn3b7NE7tmDk2MR6eh+4NiPPpWMLuo5NoSe7jY0A69nnCEOHZrnEsxQSEdlqJWDke1nd0NPrB6PjNw5/jgXjqlqT+xJZr5sex1JkwFAo+d5cqun+uCg5KysOi5KybsjrmvOq5Go/oNtxqeuGjY2AoqiZTXNE6GLZAtGNcQfx+15CP+4jdkNk05DJnJ7fwREOszyhrEuKOmeWLdiJe/zo/E0cYfNofoRve/zs1c+wEYz42evPsxVFfOT2Vc7Op1hChX7ce3jIO+djRbqbJ8yWCStN9BLCIgr91r1uWSrnu2nB6t6Y/HQJvsDtBjTGW94Us0Whpl5z9LyW+W609Xoqbsyeez2NWy00L1D7bmgd9cx+3rLe7YDn2q1DXkP782YKF7R8nVJieTa2LQgDD9dRXyv7Xwff9/jo9nPYlsVFNmEyX2min9XK/C4fvm5UzGvV9VhH7JpwH0PSM5D+qoTtsF0F6N+dLBJyrWiQq4JymeF7Dv1uTBj4dGNVTE4vpsSdiKKosC2L4lJw1gf1yEvJYBBwMkm4uEgQwqIsaxaaae05gp2dDncfTuh3A944nNPxbP7apw/40oc2eP72BvN5zvbekPksxQ+U4mR2dMbJyRLbFszOZwhbsJqv1kXesqx10Arzc5ieKJe84wfq9t5mW+jNPhzUfc0+e3mhpn1LqEJrirwh3slKFUCz3xa2Ste7vJM3zHSDCnihun88aJn6huRm+AGXf8YNVIF1/daSNuqrnbcXtlJBQDgOju/x8q0BrmtzOC8pJxftGsLxW8Jc2FXPw2j1F+P2Mc35XE6tEzZrX/xsqdYfVbF2GyRbKsSkSLGjWL1uZQZRDyGEQlY6A7qjPuPj8Zp/sbkZqb/VexzvX+hdl/5/8zdxNrogBN5un+j5PdJ7JzS/9U/xr28Tf+WL2JtD6iQjef0Qlkua+Yz6G1+HOIYooj4fYwcu8zeOcLe67GyH3PvhCXVe4vs217/wHDdfvk6dVwz6Hr2ui4WafJ+PAvY9lzuhx55n4+kp+YerjMd5xXkpueq7dG2lKy+bhgdZSc8WDGybri3
W++9IqMI3cGw8C216I7TGXKyLvDKuUUW21u8xrp5+z0rlkw+quLt6sq5plLTtz3gZq0YVVwdLkZeFRRQ5a0/6smlwHMHOi1dokhWe1hL7QkXKqvdYiw1XNS6epfLnl5oIWDQNF6VKuAN4lFfrlcSmazPTsP0N3yXRhd6cd9Y07HkeUvMIjP+/hcWjWcrD84Q0rQhjl95GxOhqn+SNIxrNYajz95Z0fBCOVZnyH7/4K4zTGYnurG8P93k4O+aPT75Nz+/w8t7H2AiHFLLi1fNDHs6esixXfPPpt4ncgAY4WyUEgc/J0QVhFLC7NeTpkzMsy8L3HH7qkx9id3dEXdfEYaA88XsewhV4G5EKXdkOlV+7Tr4rDxfrCTvY6yEiDc+DKvxdDdkbODvU7nlVreRpltXuvo3e3BVtt2kscA10Dy0J7/J7Qi7bqd7i3d2qaRBANR1CYFkWtlC+/5n++xdlRRwGPP/sVWQjCRyHqpaM+h36V4bt7w0uMYxdoTgEpunIKoVcBHbrYw+qEUm0IVDfb5/vGtKvFXqRSUTXI80Krem3WM0TDo8vWK5SNRgGHtcOdjg9vsB1bbq+/75vkB+k47/+8h2qqub0ySknxwsOdrvcezDh/3z1iF7k8W98ZIe9vS6uI/ju9x7zmz885mhe8A//9ISdQUgYuqxWBVVZMRvPCeMQy/cZn8+xLIveqMf121fW077ruTRNo/bDwob+pipevW1VsLsbKpXt/B0FR1eFsrGNegpyThcKcu/vtBr1oMPa6z1btrt7Y5Nb5u2O3ITRNHWrUzcGNAYVMI1FU7foQNO0bHZzGO99I9GrJeQJwnUhjNVngGxF1I24cn2Hi1WF6woiV+BvbOlJPlK7eDdgbdVr6+x6YavbjBbeEOvMhG/88s05mzXD5UAfgzD4kUocNOuHyRHFdEJ1cQIN5GnOxt4Gkwu1Ds+yqpVQ/hnH+xb6erGi+d3fgjCkOpshPvESYnuT7meeo8kLrF/8FRhuQBBgf/YniX7uJVXcDw+R84z5P/g6WBZykakQkL0+lmuTJBUf+tKz2IFLdGubw99/m/ThGaePZ8zmBZVsuB76zKXkqCgRFoxL1X0/KUqWsmbLtdl2lcZ+XkkeZAU3fHetE+/YKqWtQenhTwvJg6xap2p6QtCzLc61GiDTEHhXQ/aFZsnXKNOdQMvXzArUTPCVZvc7lrUm6G26Yr0vH7kKBo9tQc9VjYpjWywWBV1bEd66gcPWplpxUJa4XcWs7MbumhQoUCsK1Xg06yn9opScFJINV6y/71uWZvArMqGF4gs8KUo8C97JC0Kh7HUDSyEBnm6CTsqSlazX8b8dW9Dvedy7O+X1V89oCkn8kQPyJ2OaJMU72Hy/S+jHftRNw+89+Rab0QDbEkROQCErbvb3OEku+OvP/gqb4Sa+4/Li5of48s1PcGtwlXmxwBaCrz14ha2oy1I7rXW6EZ7ncHw24a986gUi32Nja8Cf/OBttcOfLVkmGWVZ0d/sUecVxVQRcZp5obrGo5Uq0KGtipWwVCTreaq08RZQ1SqC1ujLDYQ9zlRz4Gm43kD5smkLoAnBuWx3K3g3A99A9YbdLjS0blktAmAewxWq4Wig148JfQ/Pc6kqiePYyhvDdYijgI1IyRSbpiFwfEabfebLVSu/M5I8Y6YDap+eS1XYjaWvY7XrBUOyix0414E/J0nL5K8bgn6E6PvUF5lSOuRawQA4rkO/F3N8PubNu49Jk4yDazucnkw4XCzYDD/YXhCgruP//btH3DkY8JGXbtLp+lSy5qde2ufuyYq/+ZM3uD7wcYTgK5/Y4b/66x/n4zdHvH20oKxqfuePHq1ldUIIPN9D6LXOyz9xnY2NkMGow9nxFNd3yZKMMs1wPVeZ49gOLDVRbnmhitf4UDcAO2oPD6oILicKsjbJckXaTqzpXBVj8ztyHQbj6K8N2a5p1JRsJGhlrol7OnP+clgMtHwAs882zZth0hvin5nE4x7
WYAsAx3GoC72+GwxxXAfXtZmmFbatzLHifqzOsanVlG327FXexs9mCz3NR62SwNj5Clvn0Nfq9aq0BHB63DYoQRe7v6HNgxbqo8zbGN6mxh5uK0Le+JxkkRB1IhaLnCyruLbz3tfx+xZ6MexDFCFPL1jcPaH8/T+EJCH54UOqaQInh4jP/jzyyRGsllg//QXEl36Z6vgCZ6vPYllCp4O7N1IFwRbYkcfmXofx9x9TTtQL1B+G3H8w53CcMVuWjOc5T7OcLVeZ2sw18W7o2NzwPQSKDKd06ybRzWZcSfK6WUe+hjoA5opm5fvCWhdOQ6IbOIJEs9ULrUM3cHiif5ewLBb653xN5DO6eTVRQ6wf07w/DbTZTd+2GTmOMuApKiaV5DgpOCsrLqpq7eDnH4ywP3QHZjOQDendE6bLgk3XWbP9B456PpFuRox9b42S30X6nGpgQ0fZJrJGaMSiYws2XYcDzyAmNrZlcT/LWckaTygU5ENRwDXfo2rgrKyQdcO1a10+/PEd7I5PcTonfOlZrF4Xtrbe7xL6sR+ykdhC8I1Hr3B/ek4uC1ZlyoPZEVmVk8qUZ3rPUsqKpmn43N7n+Pzez3CeTvBtD1nX9P0u2xt9ru9uIoTF5rDHjYMdvnf3HSazJa7r4HkODx8dMz6eUJYVeVEyO5oS9yJFvNNFTfQ8vGt9VYxl00rOmkYR8OZFS0oDVfjSSqECnt3uuYVoC3rstrttU7yNZe3lYm9CcsxhXyLoCV3sBa0jX2irNUHo0O1FOJ5Dlhes0pzJxZzZeE6+yNbWm/tXt/nZay9wno6xLIs/OXyL46ML9rZG60bhX3LTE2hzHakjbTVJz9X8AosW1jeNjSuw97vYsbdWI2RP59S5Ju71PJwrXaytAFydd9807GwMefaZA8IoYDZd8tyNKxz0enxm76V/7dfhv+oxnud4juCtxxPG4xTbFrx9f8x3XzthvMypm4ZPH2yQZCWztOLLz2zzn798Hce22OopM6c49uh2fTa31b47ijy2rmzx+tsXHB8bQzPJyeMTqvEZju9TzOcUZ0e4g9GlouyBc4ntXsvWJc6oO4z0rWnavbvjqYlf2DrxzYTBaOMYw6SvL0H5lwlpZc46KMcUbNdvf36tSdc7fNtRP2s7qrh2N5XpTWcEyUJ5xk9OqVZLyJTvfFmURJ2QL376KkVV43k2X3/9lNVs9W7HvHioCrsp5MYb3xAEbad9XQw739x3eaGage4mbF5Xn7UFsDx555Lt7qBdDQBUBbKSRHv7eKNNgiggSzKGw5CdnQ6fudl/z+vHec/vAPV0juWucL74RYadb1JNV7C5idM/x/3UizR336RezLCERfq1PyAEGtfDefYmLJfK/e7kBHo9mpNT7NCjmiY0ssZ1BU0pefrdxyyWJVXdcOtKh0qq1LRypoqua1mspGLBf3+ZseEqZv24ath2Bb6wWMhmXfCEpaZyA2Mf6/13oT3vjUTtqu+uY2U9/ean1oANgRAYHm7nUlCN2YEvZIOD8tM34TSXj54j6AglcbMtxcJPamV0k9Q1Pb1SWMqaTsfFsiwsW9A8eoR17RoyK7Asixc+tMm3f3TKpuuwlAXnZWvDm2nCX4NqMmrUKsLT52jkdoY8aFsWW47NvbRgy1WxtEXdEHoWe57LO3kBsqZrCxZS0ndsVlIykzX1+QpvnFA38MJ+j+2P7MFyCYOB+vwBPsbpjNiN+A9e+AX+4PBPsC1B7IY0Tc3V3h5Plo95tHgH13b4xtNvc9B9yEHnCs8Pn+E8GyMsi6Zp2O10uHcxxnFsltoyuNKrl3t3n7BKc8pKcv2ZKwCkac55VVFWEupaFau0os4khSGaZVIVUhNJa1jttZrOm2WpdtbLFcU8a+H5rIJZjrfToWhoGeaGwW6IbknVwvZmx25Z6vvm4nHtdm9v3pRBTc+Whe3YyEKx1WUtqfJ63RwI36FeFPiey972CClrni5O+ejW87wuHlHVNT/ziQ/xj77
+x/Q6EXPDojdGOMKQYBrwHdbhO47VNiyS1p/fstTrdZoijb+/srVEbIcKEdGIQVXUa0Y+Zc3keAqezfHjM4a7A25eVx7xL+08z6pK/rIux7/wcf/uGYOuz1dePuAf/NETPM9mtSrZGIV8+pkRv/bmKbtdl9Wq5H/512ehygAAIABJREFUnQecvFwyCB1evNrnaJaTp+qa7XRcLi5SsGC1yhU0D7iuzfhiiawktm3j7uwpC1wLKktQLhYKYp4cqqI0PdG2rpr17seKqJfO1eTqBjqX3Wtldoa4ZpzlQBXDzQ1lf3yZtFaX2v71UsE0O2/Dnjc7bUPkMy58rt/K9uKBWj3UEsoc2x4gywyKlDpdwMYVbNtGnk3wN3eQUtLp+Pz+D4/4a5+5yvfeqCnLmhdfusF3vj5TDc75I4UgGL29IRoaToEpzJZQ5y9LdZuR9PW21X2zpXptDEdh8ypWd0gzOVG3zU41jB+3/Ic8IckU8bBYjRjt72Dbgtu7PY4Xf0FnPHHjGtV4RXN2rCYIWdM8eQqWRf3m29TvPKZZLhBxiPBduHaL5u7byloxjhk9t6MiTR0Ha9BHJjnF6Zw8KXF8F7sXsn1rSBjaeI5gOlUs/nvjhHElOS1KurZg4Djsei5XfWed8HbdVz3Ka0nOUiozmX3PRWDxWOvHQ6HMbWrUe0ckLDZdQVY3zCqpC7fylgfY0US9pFYFT6AahaVUDUfZNGtGva0ne2EpH/2BI9jQU7xvqVjbhZSclxVPilK/Hzn4GirfiDzyRjkF7l3t4e4MKU/n1Hfv0/2rL9P5zHMcPpmzHXq4mtDnWWjUQf19zkp1nh1brFFO9dyUcx/6ufccwaySa93+Qtbsex4j12ZSqfWIOXeAi6rmuKjY9VwOPJe+Y7Mf+zyzEbH94V31xhxFUJYqxfADfHx06zklraoL5vmS0A1oaEirnHfmT7k/f2dtkWthsRft8P2zV1XymRtxezRir7NFLiWbnRgpax4/OgELwtAn7oRsbPTXEPb5eMZ8mXD66Jw6KSnmGZ1+DEMfaz9WU7thsm8F6uuLjLWH/XbYMuHNbty13+0jr3f8xTLHdu1L1rX6d1io+5t9eNW0UKZh7kNrcmP2+r6tzq/jai/8BrnIIa1YLTN8z6M/7IJrr5n2OILdrSGe73JjOADgzcl9vnD90zy/cYXvvPWQO88c0OtEWJHDencGl1YNjnYE1OdqOAOybs10ui6MM2xP8xRWpQq6GfmwLFWRjxylCjCowaJQ7oF9Dzv2cDyH3laPWzf2cF2H3U4H2Ui+9fR7f1mX41/4+Pf+rQ+TFhV51SiOT+AShg6vvnrEa08XvH44Z55JikIyHAbsdFy+83BGP7D59PUuo60+N7Y71HVDTzsqTk4nOhnNotfzVVNXSeRqQbZMWJ2cUh3eb+Fpo5OP+poxrz3cOyNVXJfjVrIW9mCw1xLNjJ58eryGqddM+sWkNa1xA1XYTMZ7XbUNqGkUQs0BME1pmbcufUafbwpk07RyPkCmKQQx4dVba7Kg6ysXvMHWAMd1yPOKIHD42isn/PJPXefmbpfDwzk4Lk7ga7MfjTwYEp6B1x1PvV6gbjNEQdMQ+LGa6KFtruOh+rnpCc30TDcFGunIVzo1T2jCYYDT7eNs7bN76ypRHNDpeISe4Ftvnr3n9fO+hT75ve8xv3cG0ynV2YxyuiK9d4p3Y5f8cILY3lRMe8dBLlKar/0GxeNT5Nv3aZ4eUp4tSL7+HXBd6vMx3naPRtbUdUNdVtSrnKO7yov54KCD7VgslyUdW5HFLirJw6zgtKw4LkqdiKng9FVdM64UA/9OqMgOhuG+6aqiFQg18YPKgD8v6/VUPK3k2lpWoqD2SSXXMHeDen9RGnb1epiJXhVetbcfOOrxBo5DWtccFhUnZUVe1+tVwNO8ZCEl46riiufRtQW5RhrOz1PsyKM6nWL3Qh7/wQMALn7vNXzPVmzbQMkKhSYL1po/4FrGg1/t7fN
arTMMP8Dk3pu1Qt00bLn2+g9fNxALwbOhz0yq8x06NndCdY4nZUXWKOi/KNXfLX8ypqkkFAVUFeFnXvz/eo/6sR5ff/SnPJ6fc5KcIpuad2ZHTLMFW9GIWbYkdkOFqGCxKhO+8fRP+NPjdzhanXCRTiik5JWzt/j4zjXOFku6vZi6bqhKiR8ov/uzsylxGHB1bxPbVs54wsjmMsnyeA7jjOZMT+WFhtDH+Zp0Fu31oKzxPXcNTzsdX8HiZio3UPtK69ALqTT5Dao4Ru2KgNhtnepsS039hlEPqnnwxPqx1kW+qtV0lVSquTdpcRcZ2XjFbDxne2vQGnTkkvPJnKqSPJrNsIXN1x++RSYzvnf8DkJrsLe2BtjCbm1yjXQOVEFPK/W6rB33dOH37LYh6LjU60kQJUnU8jpnK26Z+ha42x0IbOUemGsuQS2JAp/5bMVqlZJXFYtixfX+3r/26/Bf9fiNbz/m9VePOZypNcR0nvH06ZxeP+J0krDVC9TarlbRpf/kByfcfTTleF7yW69eMD6b8aP7Y+LYYz7PiToBYRxS1zWBzijIkowgCgg3NrANOa27+e5gl3SuWOHGKAZ0MEvSwuhlht/vt0Y38UBB1W6g7uPHugg2qqhli3YarqUq8IbcZ8hul4NtssW7LWzDXmsyY7tqgvfClg9gu63JzuIcpCQ9OSIYqEZDamTu4uiCMA6xbcVk/8F3H5BVNd9/84z5ZAG2zWBzoB7boBhmheCFrROeJdrzL1J1u+ErrJsZg0Joz4Detvo67rfyPdtRJkX9nRbRyFZURaH+zllBUVREvsPZIudgq/Oe18/7Fno79PAjF4SgriR26BFc34AoUhO846iPUncrYUidFlx8/zFWr4vd8RGeo5z1tjawAh//YKQkAY5NOVkhLNjcCPC7Abs7Eb2uIqAttEXtht5RFw162lYQ/bSq2fccrvouj/NCkeVo1t7vNfAwKzjwHRKpfOI7tqXja1mb4BS1in+VTcPAUZaxWV3jW22DYOBws3+P1ol4ahVQNAoFeFqoFcO4lLyZFjzOK57mkpGrHPxiW/Aoz0nrhrOi5LyU7OxEXDyaKsnS5lC9D69W+KHD9q0Rzz7TJ8mlStkT6vE8/XqYKT+RzTrRb9tTz8GzLCJhcc13cYXFyHGY64CenqM8/S3Ue/9hoZqraSUZOA6C1mNgaDvYtKu3o+MVlqMQm/zNx8oF8QN89H0f27I4WV1w0N1hI+yz391mL94icHxyWZCUKxUcJCtuD65y0Ovxw9N7aofXNHi2y2Y4ZKvbwXFsrlzZVNr5slL2t1FAtxvS6UZsb/TxXZd6UVzyk7+0/zZFGdQOuutCxyU5W0Dskie5SmrTDWq90I53stHFTRvWmCJpyGtmSjefza7/sqTOwOXA2uKxqFuugM51B9R6wBD/Jjn01K7eDVVeuZQ1Wa786zcGXcbjOR3PY7+zTcfzGGdTBkHAp+7c4BM3r5KsMiqpmxGDKhhJn2+3PAHTdPiXiIA9D3wbq+fRTPJLTY9uYtKKap6p6V42iMhdWxUTu1iRQ1GqNMaiqriYLnAcmxuDLd4aP2Er/GCbPgFc2e0QxgFPxyt6HR/fd7h5c8iNawPlY19UCAuuHfRxXZsrw4g4dnnl8ZRu6PLhFw+wbYtO4BJFLr7v4Ic+RVZQVTUXFwndfkx30CHshARxgD8atc50xhjHwNLGuMbxWhi+zFUh9ULy5bI1kTHxs51RWwRN4TMadFP4zWdTuE0CnRe2O3jD4LddtfNf69r1hymefqwKvGHbG+vZ1QTiAdliBXVNmSpiX2+jx2KyQAjB1kAV4gud4vnsh/YYbg2Znk3bIg/t9H4Zsjd7eT/W9rlRqwQwNrrTE9aBOibe1w2ULLFI1YoEUAmDjvrdruY8LMbUsmYxVf4PN7Y7PDlb0Q3e2+ve/upXv/re37TnX/V/8pNYH/kkTtfDljnpm4fqbytr7IM9mnv3sVxFNhNbI5xbV7FOz3F
7Ae4vfwVnfESTplijERQF1fkcsoKqrPH6IaFjKbZ53XBysgIsLpKSgePgWErmVTZqir8oa/qOilIdOTYTWZM3tSLg2TYLKYltG4Ga0AI9+XdtwVTD2WWjIPyyUfB7jSrAsW1Ta33RvKpxhbWOn11qW9uqgaXee9uW0tYbC/FxqSbfpWzImmZtxBPaFnOpGpSBI5ANdG2b2LYJhCAUgl7PI3rugGa5Iup4FI/PsGMfEXhk4wRZ1VzpBNxfZpSNWltGOognsgVloxofV5Px0rphx7O19a567j2dztegpv9VXWNbynDH+A6MK9Vg2RbsuA7PbMfM0xJXCDqxSy0b+j2f7o0NrI0Rzt4mZBnii7/63/753rb+8o5hGH71Uzsv8rGtFxECXNvmeHVO7IakVcZOvMUfH73CZjSkqEu2oxH7nR2OV2fsxZu8tP0CkopxOgVLsigKzscz6rpROmxNBm0aFHQ/mVNWkryROLFPEwhs36Uxe+a0UoXbEYiBT7Mo18x6J/apK4nTC6gFCMemcS019XdcHekqWl29KXig/O59G7BaRn3TtA2BefyyVtO9KfhGm2+6PmGpIl9qkp4n1GSfSUgldtenLircUNnP2rFHXlTs7mxwdWuELWo2w5hH83M6nkc/iJjnGY+Oz9nZHnH+znmrpZdNC9mXjbbARev8UTwBYw+cSaVCSPUUv5709XOrUc+ramjSisYGNw64vr/NZL7CsW0C31UrBCw+9+KzXO1tcaN/hWWZ8BPbL39gr2GAyLe/+m9+fI9f/PAenY7NeVLx+HDBxiCgqFTy3A/ujbmyGWMJi27ocmu3y+E4oRe5fPLGgAqLowtFgG4amE1X+IGSF7quTZoU5GlOLWvKosRxHMrVSpnAlIWCmIuUdTpcUyv4OujoRlK0RXk1U7cFHU3e8yFfqinX8dvde5GqZsDY1XY3W2IetOEw+Yq1L7xx4jN+8o7bNhsmGIdGy9ystknxQnUOQQeqEooMrz+gwYIgoq5qhttDtrZiJrMcP/TJ6po8r7iy3WG6KFnOlogwpkkW6rEvoRjrVYGRARr2fdRrC71JvKurFvrPV6oJMjp7Q140cL/t0t3dobg4A8dD9IY0tcT1PG7d3mbQDXj+SpdVIfnFF7b/zOv4fSf6/O//I5pv/nOa//v/orn7NtZHPkrnf/ifsG4/w/L+Gczn1KWkXqXK496yqF5XTmnVPKX6x79G9uic8lRpF+V4hmULnEFEVdacP54jIo/5NOP4cEGaSt44W5LUNadluS7y00oVpTuhx4bjcFpIjsuKXVftvBM9lUdCkGo0IK/rtZPnuJJrCL5jq528geNHriq4u76Cx13LYktrzyfaJa9smjWaUNRqX7+QqsEwjPdVXeNYcF3DYHmjJu+FtvAttLmNZcFZWeI7gklV8XScspgX5A+OsTwXuxtw95VTghvbFKdzvMBhdyfieJZx4LlEtlpPCBTCcVqo9cNV36HW+/sdbU1aQxuKo1UDI8ehY9tsOCriFlhLDAdO+7oIy+JHxwtiIfAdQS0boshheHNEcTyD6ZT68VP4gMfU/sO7v8F3zr7H1x7/Dm9P73PQ2ec/eeE/4mrngCeLcwQWru2SlBldL0JYgh+cvYlsGg6XZ3zv9EccLc9YlilZVbHKC1zXwXFsxuM5eV7ieQ5FXnJ8Oqaua2bjuTLJuUhoMonMy9bW1vizzwrq0xR7FGpZWaMmXtemWuVQ1lRZ0cLd86Kdzl2hJt5cwfFW38NyBTsbQxxPv0Ea//hc6nCGS7tvA50bxMFI8QxRbuCvJ2UFsdf6sXSFtizyScKgG1NcJCxWCePxnLeeniDrmmHQ40fHJzy3cYNZlpCUJbduXuH0fIq3pYlFtkY4qkYV76aBgaeeW6VRi8sxvcJS03zTqNer47bogNl1mrQ+TTpsmoYHbz9Vxd2yVOEqK/b3N3nt6IRJNud4dc5O9MFWjgD8nd9+m//tu4f8j3/yiP/j20/5/J0hf/8
/+0k+dKXHdJpxcr6iqpSBzlYvQNYNv/vdp1RVzav3xzy8SDmdKb9833ewbbH2D5iNF0wnK/oDtcZazVdk8wWrszPtuX7catKNvt2Y0BiDm2igC13dMsYtoWB9qYOazB7bpLqBgsHThTKz6W+radt4xxvP+2zRQuTrtLiqNcUxunrLamNjQUn8DPJgHrdpYHmB0x+B7VDMpoRxSLOak58ds5qvOD1dkSQFvu/w2itPGQwCziYp49MJ3WEXuZi2ITyXWfdGHmgUAvmqJdHJUofg6P9DjgfDvRYlMa5+BiUx/AVttrO4/5ZqGPwIIQS267KxO6Isax6cLHjreMlB33/P6+d9C7336Y/ClSs0kyn4PvKb36T46t+m+PYP6N7egbrG+fznsD/5EtYv/w3kkyPqJMffG1CnBU0hldf9x56DssS+tk81T2myit7BgH7fQ65ylsuSR5OUpKjY9lxuRwH7nrd2exs4Qk/fDY/zUu2ggeOyIm8adlybuVRJdhLW+nTbUtG2Rhcf6QI5qZTHvS+sNddnXkq2PFeR8PSk7wuLlVT2tsJi7Z73VlqoAmEpC1xhWSqCu1ZmPVuuQhXS2uTdq6J/VlaUtbLCfSfJuep7WBYMhz5ymcH+Pum9U0ajALlUu3u7G+CNYp496CJRzYKnd/JVoyZ6Iw1U4TjqOSW1Ss3zzE4WtWqYVhXTSnJWVgwcm1i/Psa7P5EN40ryVpohgVeSjLSUpFlFGDpc3L9AhApCEqEP7nvDRR+E487wGiN/yDid0fVijpMT/snDX+cPDr/D1d4Wsqn5pWe+xM3+NX719i9zno6p6oqBhuaWpdozfmjjFh3P45nRkCTJkFLy3PPX12+YqyRjleQkaU7ciwg2YrytGL+rpT8dp7WYPcvUtOoJ5Kpovd/Hmdo7S13Mi1oVf8O2r5sWzl5Wa6/5JpM0VcNilbCzOSAYRGvvfALn3c55ltV6wZsAG6k/68aBs7RNiasbdZs27CnPVlA3WB2X07Mp/f0hju2wtTVgtUrp+THn6YTAcyllRdfz2YwituOYj7xwE1czud9lxGNCdJJLSILJtPcEQSdUiIKx0U0rtU5IKxX5Gzkte7/S55tJqsMF+DbzxxNsWzkWxlHAZLwgioK1AkPWH+yoZQDPc9gbhrz+aMpyWfDf//qb/PLf/RZ/fO+C3e0OBztd/tYvPMvLt0b8lz91g5NpymgUEgYO3a7H6Swjyyo+cXuLxSKn2/VoauV8d+vODp1uyHyesZgq2ZkThhBE2J0eYuugZbyb4ht2VOEy02w61/70lZbhVaqIRX1Itd+78bI3vvWW1frCl7nS3+tmweqN1H0NBG8iaM3PmL19Om+LrpmaDRlv/FQ1DZdlbXrSri6OoWlwOj1Wx0dE2zt4mzsMNvrkWcn+lR6WZRFEAXWtDM36G302tnoMr19riXXQEgBNal5Tq0nfrBcMmdAoEHzNVzh/3Fr6eqFm4etzNcRDWamcADeA0wfQ1FR5TtgJWc0TikISeDahb1MaRvafcbxvoa9+8DrytTexBn2QUnnTd3zcjQ52T72BNW++Tvabv0fzh7+r0ufunzN/8xjLbV2Jih++pfb4RYEd+xw+mlFO1UnmuaQoajrC5rSsOC1KHiSZltXVFBqSWUllhbvrOfhCacAFak8tUaQyVxcsUP/vU1mriVQ3C6F+Y+k5NjZoJzzF6veFhecKtkKPSAg2nJbJXjZKk694VA17nkPRNJyUynrXkOBq1P2yuqGrkYOLql6n7E2qGl+ItQ7eF4Jh4DKZ5Fw8XcBySZlX7H/pRexOSJ2WBNc3qVcFQmviPUtN25fjaUFN96ZBAa0ysAXjqiKRKoI30i6Bm66z9t9v9O3CstjzHK7oD8+yKOqabVex0R1HsFqVBL5NNUuoZwvqNCf71vff9w3qx328efGA1y7uMgx69Lwek2yGI2x24hHDoIdsJCfJCf/snW/x1vQNTlZj7o7HHC6nDIMem+GAtMp5OHv
KwI9ZlSWu63B0MibPFCkmTXPqukYIiyIrWaU52SyhyEvylTZvKXRxlQ3ioKMKWqgZ8z1PFb+ep4h2wsIyu2rZIIwtrCGnFbVqDAxxxLfZ2R4y6CkOwajfUQx3E2wDrOEts9s2krsadR/jyCfrNnjGFazz6iO3JQO6QjHudZ32XJs0yVguU+5OjpnlGf/Oc58klwU1DVtRl+PlEilrVsus9Q0Q7e9b/2czDYYn1Hl4gmy8au/vCYgc7A1thVpoNMBI8gYehLYKvOl5UDeIjYCyrJgvlE1xtxdRFCXLMsW2bP7ej77+l3xV/vmPe3fPuXe8YHcU8dyNIVla0jQN/cgj8h18V/C7d6d87ZVjfuveKfNlzuuvHvLgoWpy6qahLGvunygnNambztVcIQGua9PUDcIWCCGopmPIEuR8ovzXDenNTKhFpmD2NSSdtEQzM3VfDp3J03bX3NtqC58f6eKr9utWR4W5xL0Yf2tXQfyWUPt1aINwDDHPDdT3jHGOcdMzCXNm6jc7f2N9W2bqsSwwUbeykswnCyanE2TdIGXNr37xWR49mjKdZgwGIVVVk67S1tTHPD9T7JumXUsYCZ6xB4Z2oje+AlWuvjZNk3kOYU99Px5oSexKrU7m5+vf1Rt2kLImySqSrOLv/M+/957Xz/sWesuzaWRDce8pVBXOKKY4mSE+9jEsx4Esw/q5L6v7BgH1qqB/Y4NX70+R8xSZqLxob28Iu7tKl9sNOXhuE+HYWJZFXtQEgY0rLPq2zbbnsue5LGu5Ztmfl0oKFwiL+1nJSSEZV5JI797HpVznwEvNdG+AVa1g80lVr5Ppho7NUqrJ3+zIY9+mH6uQCM8TXB+G+EIl4MW2RddWK4RaT9SBJsUVtYqAXcqaji04L5Vnfq417LVGGyZVze3Qo2oallKq2wNlO5vkkqOLlChyqO4/YfjpZ0h++AC5TMlWBcXxjNU8Yzor+JnbG7rhUBeLY6nY233fWdvvGkjf03LAsd7PqxWI5ElRsZJSR+rWSO1VEAiLvmMzlZLTstKSRfX6dAJbE9PUdeH0Qqppwjd+7RXq7INtgdvxYt6ZnZGUGWVdErkB43TGte4Bge0zyea8MPoIpZSkMiOrcm6PRrzy5IhVmShSonDZioYEjo+sa7q9mGsH21RVRVVJqlLieS6OLXA8h9Ggi9sJcD23/Q+e6mm5gfpopYp/rovrRaYm9KrW+nXUTl9YsCqpC6kagI4OdTE6/AZwbbqxkvh14pDNzT6B77GzMWzXBI7+b26KvXvp3wL9u5p2h2/c+MzhKve9YBSr37UoyYuSbrdN4HpyfEHTNNybTNiOevzW/R+SlBlPFwuezCckWcFkPOfnfu6ldq1g6XOLHPVhi3/Ztc8w/wO7zQSY5MikUD9v2Pqxkt1FHeWBX5xpj/1MUlc1nTik2wmpNEnP81yquuZ//e43CZz3tRP5QBxXrw149HTOeJmzykpu3FSk5uf2uviu4Gye8V989gZS1gqgqWqeubPNk7tPmM9zskIq9LDjMewHrFYFnu/R3+izXBYkiYKIo04EFjj9EXbcVcUwiNuIVlPMVxNFIDPe7UG3vYagdYuD1uHOGMoYGB5aMxxNYPMCD4IIP/QJogDvyg1VsAd7qnAa29vL+fJBRz2ecdEzED6oczZhNPrx/ZEiBTaTE2Vh3e8jK4mwBYvHj3B9l0ePpsSxy6///gNsW5CmJWlakqwUhyHcvfJuQyADuZuVhOOrr2cnraLgMvO/lu9uUAzhMexpKZ3fEgihbVIG2/hRSFmU1HXDaBTRNA3f+P232bu5/57Xz/uz7j/6Anbo4gwiVdQ3NvBv75P809+lKUrkdAHjM/zrWzRJQnhHZcrf2e9SjFf4dw4QoUtxNAEpIU2xTSBK7OH7SooUBDZx7JDr6X3QUW9QVdOw73vcCjwkapfsWHDFd5QXPcosZqBtXjsaRvUti0g7wVnANd/luFDRq74Q9PTtrmWx2/Xo9X26XY+
t6wP29rv0ex4/cX3AwHHYcm3lqqdZ6J4AG+g5FtueUB73trWGzpVxj4LqHcsib5T8bS6lNh1TRfcwLehEDkVTIxt45eFUwfeeh+XaOL/4SypR6ic+QtTxuPbSPmdnKRLlxOdZMHQERaNscFVBUh/GayDSzoAjxyGpG0aOsg12TVOg/1NOKsl5qeD8ji0Y6hS761q2eLQqFFLl22ujo9WTCT/9peeIvvzZ97uEfuzHC5t32IxiLMsiqzKG/gDZ1Hzz6Xc4TyckVcpJcswgiHln/phbgwOOl0v2Rn0WRcL13gGykZynU3JZIhsFd4ZRgOM4uK7ad3quCk8RwiLPCzpxgNRvaN3tnnJp84SaQC1LFWFTcE3Ge4MqdqYwa+285Qpt/iJbi1s97VuOcuoLAo9uL2Jvc8jV6zvcemafT770LMF+b22zu56cy7rNga+atvgL/bmh3d+7yj8DUCz7RDV2lZQsVindOCTNFbJxfD5hOlnw9kTpeb9w7XPYlsVPHbyA69p8/sN3OHx61j4/Y4Vr0A6DUDSaSFjWa86BMB73vg2bgUI2jAmQJ1TS37IkOZ6vvQaGByOizQ5+4DE7U2+YUeCT5yX73S73JxN+/sMv8FdvfeJf5yX4/8vheQ6bmxHD2ON0nOC6gl7P5xuvnvD4dIlrC350OsXzHL59f8LBTpckqdjc3ybLKl6+vUG363M8SZktchxH4LgOnY5aH9q2oCxKbMdWE32eI8sSz9fws+sr+1UTSuP4rd7bslShy1eq6BbagOhf9LEHNZWa+3mh+n1NA15EvDGkzEu1g3ZsvMBT0bpbu9AdqfsbDb+Btg1r3fx+I28zEL5ZIRhHvWxJPtVEQUetL/LxOY7nUC5XOMMt0tkcxxHYtqDIK/7dLzyDbQuev71BkRXc+fAB6VI3EnXV7tQdT7vlOa2/vbHrDS6Ff3U21Ll1Rq1JkIH8zfOqijUSEVx/VocHhbCYUOSqSSvykiyrmE4zPvTiAR/78PZ7Xj/vW+jJMsQnP4nl2jRFyfIbP6Sezhk/nmGFgTLTef1HZPdP4NEjkreOsRybnS9+FKfjk732DnVaKt31ZAL7+1SzlEbWWLYGL6mcAAAgAElEQVRQ8jth8egk4e4kwcLCswVv6gQqASSy1pauDtd8lyuew7SSLKVymZtU9ZpwVjZqny5RAS1q1ang7G3XxheCpZTYlqVjaAV5XpMkJa5OyXK6gboGHEFfW8R6uojGtkCgCvh5qeJj59qZJtZF3uTPO5bFrudQ6F15o/fprqVseG8NQ04XBbf3u1zbibg6DJFpgRzPcLohze/+NsNnd6jvPcCOfZA106zS0jn150k1crDpKkmdOYeeo6G6pmHbVWqEnnbiM74CtmWtGw9jLTzS0joLixPdMQJseA6d2GW4FeNudKgryWSSY3k2yf/zB+97Cf24j7qp+Yndj9I0DY5w+MHZG2RVzlmyYBSoKM63Z3eZZivKuuJHZw/oeh6fv/YcrrB59fxtlkWOrGtsS7ARhkgpSZMMx7WJ45CyrHh6csHh0QWVlDiOw+R0tn79FouEZlnidgIVyhLYyod9mreTuYHijcxNKEiewFZon5GiGeY+YHVdPNdF1g1JonzJq7pmK47Z7MYIoRLm1gY9NapAmrAbsyc3zH0D51u0aXk9TzUDoaNManw1fddpxd7WiJPzKXdu7rO/s8HVvS2EsJhmGVe6Xb59/F2u9YY8WRxzazik6/kslylu4LG26S00muDZ7etQala9JiTiCuppvlYeCN9Zr0EIHcVD8AViFBDv9dTtuWRyPiMIPKpK4vdCbFtJI29f36NpGlZlSexGfPf4zb/8C/PPeeR5xcduDKkbWC5LTk9X2Lbg7GxFnkuWSclvvjHGti2SvOLeoymuK/jkx/fp9Ty+9eYZ5+cJi0VOkpQKvq8ky2VBp+PT7SrDnMV0QXp+juU4eGFAvtL+7EGMnF2owtTV4TamyFaF+hxp7Xxno5W2maQ6UIX
0wGGScm65xOa1PbJirF9VFYxibJ8BsiRhmgJHqSoS5nqT8MZtlJLN/L09sjAm8yLshzeQbYMsTmksuJiGirK6hKRrcO7cv/pT62d4bY52USSZvj+hpNvcHG7zzMo652qnWHSbPNJ5gKtt8TOslZuYqn4SdfwnNz5iFvlcaV5ARuKd7nu8e7yNKqtsVFYpGw77sy5V00RTNH60/z5780MMRcFWVSaTOYfdIXGS0FioYeg6jmUynswpVxw0VWHuByRJiqYqovBnGbIsClhSjCa/+sLjhFGM5GromnaizXAsk2bJoWE2cHUdTVeZRj5xltGwLJ5cusQwmDD0fWr1MreHIpaz53k8tbLG1niPy6dXqJYdJFWsoJ58+kFGoynPPHmZ9lqT0w+s8u1vvEZ7oY73UY9y2eFUvYYsy5Q7FfzE56WvPsvdnWOO5nPCIMKtuwyDOZ/dOE3TqrPo1nj94A6zmc9//3vfYDiYsLDYAF1BVmXSTECCHNfiXL1OluUYusZk5rG62GL7ziFpJgq70XDQdI0wiohicXjSdQ1NVZBlmQc6bT7q3wXg9YPrNOwqkzCgbrosuE28OKDVqtLuiDS2LM85Xf7pndCn5To+ntNuOSLUytH5cG9MuWxSLZmMZxGWpfHMoysYhsrGuQVkWaJSMVlqOnxwfRdFESp9gA/3J+wdTonjjE7VYjwNiNOMUsmg7Ojc3B2x3ZsjyRJJkuF7AVEQkWUZ7kKHJE6wHIv5eI5TETz3PMtRVCFE03RNeOYNXYB3kpQ8y1k91RZjdVnFKJfEqL/VgVTsr+NUuK1MS0SvpmlGrWax2HZptUTol9uqEwQJyaDL5s1jdF1hHiZsXFzBqZTALpPncO7qWYgDOqtt7LKLWm/x1g/fF1a+g9uojQ6Wa2G7wu42nIVsPP0Yw+4Y308YHQ9wllYgSXBcg/WmhWEo3Hh7h2AecP3aJr4XiTwAVRTqez+rJEmouvgZVV0lCiIUTeFwb1AU8gy11hKvUxQTBWJSk6ap2OvLEratcXdf/Btt3z6iXLVJ4gRFkRgOffE5JWFrdByNctlEVX56Of/EQp+HkRDj5blQXutCSRnfvivY6+9tCt45wGSCWrUxHzpDOp5TXqkK9XqrDXEsIAmtktjR+z54HuHuUEwIJlO0uoO0sUHmx2jNEoYhVOEbrsmuH2LKMmkqUuJKqkKai6haW5F58oUNwixjo2oJhC1iCuAoMnubQ5qahipJ2MULIUlw5nIT73gKbgkWFsQUIojIwphJd07aH+HfOiJOMoIgxf2Nv8Z4HOJULWRLJz86Rv/MA6x98RIdTWMwCPjj947RJBmzZLDQsTn/1Cq6LDOdRry4UCZNM7QzqxxFMZqmQJahPXiOh//uf4U3DlipmZzfKIMkicS67W3iwyFIsHR1CUWS0HQFreny+c+d4VzHFShycqyVOnffPz7x1lcUhfqFDj0vJigmH1kuin1dU8VKRJVp1E1Wz9TRdAXJddAXq6RTD71VQu9UUVsV0DTSmVDcz0c+9Y7D5E+uCc7C7dv/Go+vf/PXPI5RZZWKYRcIVpeVUofjgn//1tH7jEMhEOv5ApLTthuEacQD60vUzQqmYhAXudqyJGGpBqOCprY1OmZzOCRMIhRJomO3sDSNsmFQLjvM/YDTKx36vTGaqhAlCUkQYRk6YSQU5I5t8vhjF4mThLWlNkmSEicpUSxO8n/4vWvUKyURqWqbRAV975mr57m1c4ij2ZypLnGh1WQUBPQ9j1uDAbIk8V73gLgg9v3tR36ZOEpYcF2yLCVOY55YOsO/9bknuHBmlcl4zo+/f535PGC1XGZ5ucXzD18EwPdCOp9ZxfMCLjWXmB5NqJZdtsZ7PLV8mr/z6/8R/bnHmVNLXL50mkkY8u7xEf1gyN1Jn4phcHZtAT8UAsaSY/HlF5/gkStnUQul8tnVBV67fYc0E2lsEhKLy036wzFZLBK9ojimXnGpVQS50DYNHNuk2a5SKtnYmoa
taaR5iq2ZWKrBRnWRttMo0gkzVlp1ri4u8K3b15lFHjvTT7dzBMTDXFdlkiRjNI9QiiI8GPsYhnJirRK7W+G9PrdSJU4yaq0KJVPjynIJQ5MZTgKaDZtKxUCVi+jVgyl7213STKjxO1WLet1B1xUR1xpGOCUHb+qdhK4kkRjxJ3GCYYno2aX1NkmcUGmK0bIkSWRphlN22N3uinF5nmHapsBKhzFWpYxlG5iaRLPp0Fmq0T2eMp2G7O8OGc9ChkOf0A+ZjWb8jefWIRF/tyTB3v6EjbUqaxttqu0686nHzTc/Qm+0MQwFy7V47Olz4M8Y98dQXSAZ9ugslJnv76JZJt2+h+fF/Ke/8RieF1FfaOJWXey6WL/tDQOGQ59qs0pzoXbiWmg0bBbWOiytt5FlGUVRWF6tE3qhSAKMEjRDo1pziMKIzBevHxJYjoVpmyhFuFW9WWbtlDjoNqoWcZySZTkXLi9RrZqUyhayLGOawuYXBRGmbfLBjT36/Tl+lPzU++cTC30Wp+jn12B9HYZD0fEeHSNriujiGy7IMrnng2kKupckkac52TwS9qsjIdaSFhfR19ownwu7mqJgP7SOrCpkfoRcq8BgALKEtrbAdBJhyjJHXsSKZeDIMvNUkN/6cYIpy3STBBWJPM1QJYmjSUhZFd2+IYsOP8+FSl2XJMJMjKpNTcG6vI5h6wLtenCAbOokQ48syej1Bft+0POZzxPcksb8n35TYHbjBEmRkCyT2TdfJToY8uBji3ixQN7m5Ay7c9yKweZPdnlmvUatarC44HD60RVx0AFKrkb4/hb0emR/9AfIssSZf+d5kTNt6einl/Df3ST1QuJpgGxpPP1bz1F74gySrqLYBneO52Q5rFQtJEXm9sAjyEQGwNVzNbIwJs5z1gydrMgLOLVaQitEigCNZ8+jdyrotg5xjFQuE2z10E8tkEcx8eGQZPeIZOQR7Y9EdONGC6NVIusNhB3vU3wlBT3ugcZZwiQmyzNuDncYBqJQG6rOPPaZhHPKuk2cxgwCUfAVSWJ3dkg36BFlEXWrwsXGEpPiYDCPfZ5beZDVcplBMONiY50PB7cZBQElw2A8nlF2be4e9FhcaKBpKp4fYroWveEE09AIo5ggjIgiAXMajqfIsnACGLqOY5tUyyIZzjQ0oijGsU0sQ+fn1h/EdS3mscc7x3dwdB1H09ifzZhNPa4dfsROd0AcJVSqLv/ne988CcwwVQNVVvmDj97hYDZjZblF4IcQZ4ync7ZGI0plh2s37/DwkxexLINarcTlU8skWQq2SrXqcnt4TJTG/LOb3yNNU37rhc+y0qhSNgweX1zln338E272+3x01OUzCyv857/2ZS6fW2O9UmHJdXn3vS2yPGNloUFJ17n+/hZ5mjOfB5zbWCZJUuZ+SL1RJggjNFX8vYamYegaeQ5PXz7LSr2KJEkESYKpGtwZH9O0qgyDKV4ScGe0x+Zol73plOPJjJVSE1NV6flj3uvd+au6Pf+lL1kWHflaRxxwhqMAy1KZzWImk5BWy8ELE0xTRZYlajURZTqchTiOzoe7I/bGEXsDj6sbTRxDPbEmjschl882aHREut2pxTKvXj9kMgmwbQ1/5mO7NlEYUaqWMCyD0A8xHEuMvyWJ2XjGfDRFUWQU9V6mvYSsyEiydFLwoyDCrFWZT8XEoNKocOHyMq5rCBS3HwtCYhgTxyn+XDxfvHlAHMW0V9p87dUdmPXpH/SpV0Re++3tEfW6RbnqiN14OCfPc0aDOUmUsLnZp7S6RhInlBcXqKytMx4HJ+K9vZ0+cRTzg4/7+POAzz+7IXQtQUS1avLG+8di3RGEnF6v8tnPX6ZSFYEzgReyc2ufJPCptWskSca4Jzz1qq5SrpeZTsRUxGk2TlT1paqLbuoYprBAep5wQlSrFlMvEmE213dwbJ2ya1CrWXSPRsiyxHgcUKq6NDtVmp0qtq3z3ke9n37/fNLNpdQr0OnA0ZFQoqcp0fE
EZX0ZDIM8SYl2e8QjD+/dLdRWBflv/Lto/+HfRnZ0sbv/8JbApe7uQa0mUtnSVIzw8xx1Y4V4OCc56JMd95BkmfCjHRRVwtQE6U3X7ke73iNxZoiEtnbdJAtjHj1XI0F414Piv8d+9QrjWcywCI+5p1JXZDh65T2crzwLSYL/8T5ZECHpCqNRyMbFBnmasXBlET9JMQ2FyfGcpY6NbGqkXkQymOI+vE7cnWKdW8KQZS44Fk8+tnzy/c6CBF2XxXQDmN8dkN3ZoawotFbLpPOQ3A+IDkbUHl7l8Pd/hKLKaA2XG7/9I6yzC/TvDOkPQja/v0my34V6neMbB6itCmahoHcdse5QJYm6qqIgibzqvRGzLGWSiJVAXdfodn10TWa5KsZj3vt3md8+Qqs75HEMlQrmegNcF2l1Be3yWdSNVeyLizhX12lcWSaLUpSKjexYJP3ZX3jvfFquhmXTthsczI+Jspi96THzKGKlLJT0EhIDf8zOZEjXmyBJEl/d+BIvrD4DwN60x+70iCiNuTncYcltU9JtvDigZpbRFR2tSOO6M95j4Asb0cAvLDKGgeuYaJqKKsuoikwYRdiWYL5LEmysL6KqCo8+fP7EQx7FMUmS8p985UV29o+FAjqMUVVFiPbynH/y7qv82tXHyfKUDw6OyPKc7bH4+y+tLOLHMY+srxDHCSXLZG86pVJ1SbIMWZKYRD5Pr66yMxjxmYVFDFPnwsMb/PyzD5NlOYoi480D2s0qrWqJWr1M1/MIkwjD1Dm11CLLcxHvO5txZXGB//vaT7A0jdVyi//t269wplZjNJzSPRrwe29c485oxEq5zI39Q05X2yfxvrZj0nYcAj9EUiR0Q8MtlNqkOUEQ0a5XaDeqHB4PsUyddqPC3A+43R9wc/cIyzbw4piy4VDWDUq6Q80sca62zqnqMk27ytX2Ko+tLjONPE5Vq1QNh9GnHOMMnDgjpn6MrglVehCI1YVtaziWxngeMZ9HaJrCbBax3nJRVZly2aDbnbN5NEWRJW4fTliq26iqjCxL1OsWlq6iaQp+lPDerT7DoUeaZuzuDE5EdIZlnHSfAOScjOtlWaa9ukCaZrSXGieFPfRDkijhxc+dI45i0iQlCqITVX4URuxsD3jmilhp+n6M50W0Fqp4Uw/TNul256yfapBnObajMx4HKKeuYDomB8cz9vfGzGbh/9PeewVplp73fb+T43e+nDr3dPdM9+SwYTZgd4HdBZEIEJAJMBSzSJasYtksVfFKrJJLtkvyjcqUgxzkMiyXZLFIUCRNWjQYAZLaXQKbF9jJ0z3Tub8cTj7HF29jZFcRoEkWCYjV/6uZqZrzpVPneZ/n+Qc6HZ9TS2W8ioczu8CFq0sAWK5YM0RhRKVRwXItsiyje9AH3aLaEsU3SzP2Dsasnm7wu398F1WVWT4zy6339zAM8Z0risLb7+xw83aHSsVia2tAe7aE5VjIuoFuiMOndKwAy/OcOIopeP+eiOgWXQzLYPv9u6iqSqHkMh1NiYKI6TSmUNAxdZXFGY9qu0oQJqiKzPKMR7lawHE0FhaK9A77ZFnO/FyRYtFg2PuL6ugNA0nTQNOQGzUolTCWW8IFL46JdvtoDY88SpA0BempD5D9zq+Sf/HXsF5+BhQF2THAsogOR+QPHoLnkW9vk/aGIglta0fY0R4NCe4dIWky6TjEtjXSVIybB77oTJu2TnisDQ+Ok9j2u74g8H3iaTxFIc4FOQ8gPfZwliUe5bPLSGz85Evs7U/F51AUOg8HyJUSesNjNI55eLeHbGjo8w3SHNyWh+OozL5wBvvaaRTHwL8jutxBz+fW//k2V/7hj6JrMta5RaI4QzZUzp6rk2U5+bF2v7DaRHZtrqyWIc1RHAOp6BEcDJEbNcIwxTs7g2xonP/JD4FhMPu3P057ucTyk4uoq4vE79+j4GqQ5zzxf/1zdFlid28iViDknL/SRJUlvKUqr97pUtc0DFni9Kkirqsdm2S
kbPV8Vk4VUYs2UZSR9KZItg1BIH6zPCe9c59s6yH53r4wR1pYQLl6EcU1SLpjpI2zf/4n1l8zHM2mbtVpOw083WW20OBUqf3IJW8ST2k4FUqmyeF0ygvz1/lnb/8rXtn9Kk/OrlE0bFRZYaEwxyCYcn+wQ8n0eDDcox+MUCWFSRQxU6jSDwIeDIdEaSq6+oL9qECnaUoYJxQLjjCJkYXnfblYoNMZoGsqP339JbyCTZyIkb50PHqXJJj4AVkuiDqyLPFDLz3Ne1+7h62ZKLJwP6vbRdYqFSZjn1t7h9SdAo+1TxMEEVXLouE4/NgTz3OxMUfFKnK/30eRFFRV4bfe/Tr/5cd/BEVVuNCYIU1TlopFLqwvIUsSe50+dc9lpVwmzVM2Ti9QMsXDq2x6pFnGvFfj4KDPuZpgtv/sd3+MK811/v7Ln2N+scVja0s8M3+adw8OcB2L2719fuXv/gMA+v0xQZKg6RrPPXMJXVPRdI2vfuV9ypUCkgTtdhXHMTF1jaPekM3tQ1r1MjXXQZJgNJzQdgsMwwmaojGKJmyPDni/cxdFkhkEI6IsYanYQpYkDiYTztVO4xnf+Tp6RZGpuAZRnFK0RR76ynyJUsmk7BqMp2LPa1ka43HIhZUqd/aHqLLM6bkSmqagyBIzJYvDY9a9aajc2Oo9SohLj4lgQZAw7A7RdeFxrxzzH+IoFn4PeS526lGM5YggHNM2hS48zVlfPZbpHnvZq7p4HUUVMjNZFl1+HMV89hMXCSYBw2lMmufsPjjCcXQkCdJEkABNU2N5xhNGPYDrGjz5gTM8/+wqt772UOzEVZnRKOT1r27xk5+5ICJwNeE0Was5LK+1MKA5AeIAACAASURBVEyDUX+EbetC9qYq1FdPMTga4FU8Gq0ivZ4Y4Yd+yOysh6rKbJyfIY4zrl9sU/BMSmWHRsPB92PiKCGKUj74/GlkRWY8mHC4P0CSJNyii6Io2I7J3a8/wKt4yIqYEhiWQWm2hT/xOdrrUKx6lCru8UEnRpKgOwppNFx8P6E3DHhwMMa2daEwG4RcfWIZw1BJ0oyltsfqmb9geh31OrnvCzOAIBBdeKEgRvWShHXxFJKuoxQshjtD8lf/SJitDIfk0ynYtihAW1sY64tIXgGpXEN+8mkkWTpmvOcig9o2UMs2imMiqTJFTyfKMkxZ5vIzi+iyTL1u0dA1Vo5DZHRJpuhoqCWbP/mFf8soTcWIWpKoqApZnDxy1MvJefKHn8LVFP7kn/4WkyQTev4woLHehEKBwdd3aTYsSkVdaOzLQo4iOwbO2Vn2vnSTvNdn/PUd7NUm4cMuvp/Qatrs/je/yJmPbZCPJ3gFnTzJGHampGmOYunIqkzSnzB89RZWzcVaa6LYIt99Z2cCsoymy2TTCBoNoVjY7wGglhy2X38opG+A97TYm+a/9L9SKxosLhSQNIWComCdmeXCWpnxVhdXVjh7qkgOvHGrR/tUma2xMGZRJAnVNdAqDqapoNZcogcHYJpEu33yvX0RqiNLSMen9uiVN5GcAqTC7Cg/OkRxzb/ww+uvA/NeEz8RARWTeIqr29iaSZhGxGmCoei0nBq2qnF395Dff/AKc4UqiqSQZCltt86D4SHvd2+zUp555Bc/V2gyiX0myZQ0z1GO9e8FXWfeKxOlKZZlEByP2j/42DlkWWJxqU25UuDK5TWSUHTtiiJib3/+l/6liG9Nc9I0w7EN0jzHtS1BVrNM/pNPfRe6rvE//8bvE4YRaZ6S5SlrM0083eGVTREyVfYcuv4EV3OEvjdN2ai2+bWbQjHw2/fe42prjs1Bl4P9LpVqkf/qj7/As2dXOfJHLFTKBGlKdzIlTFMKnsMkjtEVhdd2tnlqbZnLzSVKpsmM2+C9r99HlzV0XcVQdYIkwjNc0jwlzzOemJnlrfsPSbNMjPVn5iibJv/t61/Acx2WltsUDRFT/eGVDc6fO0W/N8LxbNZW5gC4fXebM8uzjKeCjCRJYNk
GBV1HVRUKnsPmoE9Bt9ke9Tny+5RNYW/rJyHboyEPhh1m3CamauDqOoNwSM36zpaIAkRRwtaRYJ8fDQOaZQvt2MkuB1oVm6Kjk+c5URjz2rv7dLs+nq3x9u0jmnWH3jDg3uGE9eUKhqbgGCqqKuNHCVGSYZrao3gBt+gShjGGZWCaKv7ER1VV1jbaZGnG0qkabtFlcVkUdfmY/zSdBHzxt94RVrBZThKEmLbJaBQSjKfIx6P9l144g+M5/OKvvy1G7NOIaZRx9uI8WZYTRYKYluc5e9tdVFk+to6VaTUcbt08RD2+lqIqTMcB3YMexUqB/+kL71Btljk6mlKpFphOI8bjEFmRacxU6XVGBNOA0A9ptT2uXl+l0RT3Sf+wj64r2K6NpsqoqkylYNLvTYnSjGbTZTjwmUxiRqOQpWXhvX/34QDHc2jPVZAVGd3UabTLLK40GfbGmI5JuSoIiGmacrRzJKYdcUISJew/OMA0Rby1YSjsHUxQVZkHmx0GgwBVlbl/r0uhoHP75gGaJvga5bJJp+Oz1/NxHf2b3j/futADUrkiyHOGIXztp1OkdkuM4BsN4YgUJZRXalAsCgLe2ppwTIpj0e17njgklErkgy75G19BXpgTCXhAOg7Q6wVkQxPmNUlGGKYstF1cU8FYqFMq6tx9MOIoTrj1YMS8oSNLcOaHnkOZaXL5xRUAWppGQVUYpRn6mSVato4hyZxfLPLVf/EKF3/4adRjKd431gdavUBy4x7lx5ao/70fo9BwUWwdJhOuXG0R7fYJd3rIMoQ7PQrnZkknIfZ6m1MfEh1+pxtw9Oo95E9+BqPpYS7V6PUCKmeaJAOfLMlQKy7mfAX/aIxUKgpXwSji7M//MBgG9XNtMX5IEmE77Oj4v/kl1IrD4g8+j+R5aHN1KJUgz7n7y69hmip3t0YkA5+So4Fl4T62wngck5JTOD/HlfMNrj+7yI0397m6VOKJv/MS63MF7I8/R57lHB4F5FGCfmoW0lS4FdaqxEdD0nHI9MYu2kIT47knyCcjpGoF1bPJHzwk7k++xd3z7cfBVDjEpXmKIinc6T1gGgd4egFNUZnzmmiyRpylXFqcxU9i5gpNlopz9AMxCksyQQRLsoR+OGJndEAvGDJbaPB+5x4zbpFRNGWmUCLNc/YnQwq60MXXyh6qLLNWadJqVLhxcws/iPjae/eo1Iq4tsnf++5PcLa2wHc9eZEkzajWihQ9h/Ek4GprAV3XMA2NSxdX+cJXXufnvue7jyV4JmmWMU184jTl9za/zrX5Wf7Rh38C85jB3wsGXFpbpB8E9IIhcZaxN+nyzPwqlmqwVmnw8avn0WSZOEp47fZ9Xlp8ktVyjdVyjc7RgDPVquAGaBqebrFerfKV+w9o2jU0RSHOYv7xZ3+Esunx+PlVVFl5tBKZxgG/u/VVAH78+gs0nAqnq1Vm3Bols8Br79zGNDTefe8ecRoz26xiqQbX5mcZDie4tsUHVpf54NOXePGZy7z+7h2WFlr87Kc+xtJck5965kVOlWr0uqNjp781ojSh5XrMFhpsDg5QZZXbvW0uNheY96rcHzzEUMRDcX/aYRh9Z/NMQPChJ0HMYWdKpzPFj1I6w4BWxRbZ5IaKoSk0Gi4Li2WqVYvVhRL7PR9FkY51JDANEyZBjKEqbO2O6HSmmJrC/YMR5bJJmmYsL5awbB3HMTAMhTTNqDQrwukuhyRJ2LzfIZgG3Hp/F6foIEkSZ862cVyTFz8sOmqv4mEXCwSTgGrVpjHfQpIl5pab/N6Xb/P4Y4skSUIURAzGIWkOw2HIaBTQbLr86GevYZrHk7cg5vT5JSRJYjASRfvG/R5zSw2qVZszG02WVlskiYjY7R70mZvxaDQczpyqMuxPWTpVY+/hIeVqAcsVI/v9vRGnWgVMU8WyVH7ks49RLVk0mgVkSWI0Crmz1adYsvj6XfEsuXi+zXNXZpAkiapnUqvabN7dR9M19nf7ItRGUymVTEF
mNMQk9cxKlUa7wuLqjDALapb57CcvUZup8cSzp4VvfX9MGKZcXKuRphnVusfKcpleT2jx+/0Ap2ChKCJBbzgM6femgo3f/+YrqG9Z6IPffYX8/j2+IbZWL5whHwyRlk6J8Jpul3w0RrZ1jBefFTn1I6GRZDQC2xaEvXYb6doTgmy3t4d08RJSrYbeKqJWCxgbS0hegaP3doV5zeEEXZcf7Z/CzQPKZYPVRY+nz1S5/NgM5aIgb6BpEMe88sVbvPBzn0aTJBxV4cmzNaRWiyjOaFRMai9f5tKLq2z/xutUKyarix7RXg+SGGlhQcRgLiyQb93jYHNAMvQhy7Ae38B95hyKbeD7KeYHrhEfjpBNnWi3j1Qq0t0d0ahbuHWH/N99icl2H2V5gZXrC6glm8yPuL85Qp2poy+2KH/4GmQZcXci+At3boGqotg62kxNHKxkmbgzxrp+nmi3L/7t7BWRJeD7JJ0R212f2ZdFbG4wChlPE/F9ZIJf8OS1GQAUW0f1LNbOVChvtNn813+EaSowGhE+7JIkGfHhSPzfKILVVZIHe2jXLhD3J9jn5o8ti2UkWZgJqa0K0tws1rmlv9wT7K8Yv/q1t+gEXRRJwdZMFrw2k3iKpZposkqYxPSCAbqi8v0bH8HVTMpGiW7Qox9OcDWb9eo8iqTQcmoMwzGObtNya1iqhaHozBYatJwaYRKxORhgaxpbh11sW8TU1uslRtGEeqPMU1fX+eAzl/jwU5coFRwGowk1q4SlGvzKH/wJ//WP/TiOZaLKMlcurnK5scF44nN2fYnvWjvLRy6d5V++9gqz7RrtdpX9aZcgCfjQ0nmqlsX1mfPcGd7jaDxhEIaEacSl5jwfW72IrQm3yY+f+iCH0x5+EhJnwuhnMJ6y3KjSblR4v3uHe/0OG9UVnt5YYa5QxZ+G7Bz1WCnP88zcRT5x/iK6ojONY3rBkNu9TSpmmdOVOhIyWS6uuzc54gNzl9ifjNEVlaeaT/HC/HVu97bZHnXp9IZ8/MlLhFFMlKVMpuJhVTAs5heafPTpyxR0G1WWWSgWuXp+hRcvrPN/vPIqWZYxjqe8sr2JqipMxj7jaIomK2xUl+n6A8qmw5E/wNVNwiSmoNuYx5a352oLOJpFzSp8O2/R/184OBBWsMWiwWQSYagyQZRyOAjwbJ3eJGISJMxUbVZnS8Rxyo17PUxdIQxT0iyn7JmEcYoiyzw4GuP7MVfPNbm8WBLCqrLNXN3l4c6Qzn4f29Y42hd74DzP0QyN8ThCVVVqdY+FlTbXHlsS+QTTgPW5EsP+hC9/6QaXri2jqAqGZYhDgir28fV2lesX21y4PMerr97DKTg05kSQ1CgQITq2rbMxX+L+oSDUlWseu50JpZLJ2lIZw1AJ/ZDnL7fFSkIRZL9KxaJ32BOMflni3maPnZ0h7bLF6fUmjYqNqqrsb3fQdI3ZWY+zG+K1Dw8nhGHKG3c6nG4VWJ4tCi6LLKFpMqapcmmtxmAQ8tY7O7y8WuZTzy6y15nQ6fqQQ3umSDAN0I4ldd/43dozJR67vsJwGuH7McWigWmb9A4H/KsvvM6oN6JRtNjbGxN/I+cBmExiPM9AliR0XUwujg5HmKaKYagM+1PCMGF2rsj29vARufJPw7cs9HqzKLr09DiIo9dDMg2YiCKO4yB5BWRNIb9zG0olojffFyPxOAbHQZ5pCR395l2o10V6XecINB2pXCYdTZEWl8DzaF5fRnniccIwRa8VUBSJctnAOLtM8fppdF3B+48+BFnO7NkGFz56hjv/2++BolAyVNJ33+fCS2tc/JmP0jn0yff2iLOcxe99itf+ly8THQypLVdY+MQVKs+fQ1+eAcMEXRf59KpKdvsujqOit0tiZTEYIK2fQ9JEnG7vl38fteqSxYkYW9s2c597DrfhYp1bhCShsNqAJEF/+hrhTg+9XWIYJ0KZMBiKgqqq6AsNsm6f6Ws3QJJQf/bnoVoVNsOSxGizS3J7S7y
XKCJ/6zX8X/oNpGdegDzn6rk6kzfu8th/+nEMW2P9bE3IHXUdfX2Z0cGYdBKiuCbK1Ys4FxfYf3ubxR94juLnXkZqNLEuLDO/XsPemBGe9poGh4dIqox07WmsjQWhFIiEtWMex6TTEFxXHPYqlb/kI+yvFmcaNbJjIqapGozjKWme8eWHX8FUTGpWGVe3qVhFXtt7g/P1Nf7Fe7+Nn4S4mim6/TShbHpMYp+y6eHpwmDFUR0KusPhtIurW6xWFnh+4RTPzz8GQKXooqgKlWqRtfICH1pepWbbXG3NsuCVOXVqhueeusg//r//DbqiUXBsPv/GH9JuV/kvPvuDdI4GxFmCY5v81JMv8j9+8XfZHAxYmWvy6ceu8OkrV1gtzWMoxiOC1CAc8cbeLWEbW29jKDqqrLBeFh38U7ML/Hdv/BtcTcTthmlE0XT5vktPkmQZ69UqnuGyWKzgJwHPL1xia3iEV3TodoZ4eoF3j25Tt8tEacRyUTwof+fuLZIs5sfP/SBFw+V0ZZk0T/n9e3e4P9xm3itjqgav7L/CL3zlV/no8geYcUs89eQ57vR6/MMf+ByqLPPU1fXjOGCT7zl3mXu9HnuTAeu1Bs/OXeal5XN8+cYd/s4LL/EPPvn9LHtzXG7NML/Q5Ox8Gz8JiY5zCPrhiM+sfYwgSdioLjOOJsy4TWRJQpUVCse/Y93+zr6HAYbdIZ6tUbR1HEfnoO9TPta9SxIULE08oscRf/zGNnduCpa4psoUCga2oTL2Y+ZrDoNpxMUl8Zn3ej6bRz5rMx5RnBIeGw9VmyXmWwVs16ZQMHBdQTRbPpaxaZqM5+l4toblmJy7vMhvv7JJseximAadjsho+MSLp4/H/jJxGPPhpxb5jd+5QZrmnFprcuXyLDMzBZxjjf9M08W2dXZ7U2RJwp9GtNsFJEmiXbE503aFPHqtwed/6XWKRYMwFDI034954fkzxGHMwrLo9GdnPcIk4+x8ibtbA4pVD0VVWFwskec5iiwRJhlxnJKmOcNhiCTBT19fYKlRoF53qJYt+v2A3Z5PluU0mh7//JWHfP7X3uMT12YYDgPOXZonDFO+71OXMU2VmcUG6wtlajWbCytVDg4mpGnOzEyBD19qc+HKAnbBZmlthseeWCbLc4pFg+ZsFes4ldC2NRrHE5ufenmZOIpptoqP3muapCRJhq4r1GrOI0XNn4ZvPbrPc9GFf8M+bWYG5ubE3v7omMqvaUhrq+LP4zH65XXwPPA8pEqVbHdfSOqiiOSdG4+kdfn9exAEqJ/6HuFHrKpIp9fI+z1aGw3UqktxtYHR9EQBarVwX7xK+tZ7KJ6FsTKD+oFnWX5uBVSVCz/zMZKhj7qxgrR4isUfeRGiCEsX137i5z4tonZliax73CFrmnjtnR3yNBPe+3lO7fmzYhIRBKSDCUzHaPUi2jOPEwYpyplVMj9CnakhlcvQ62G9dJ30qA/VKuF2TxTC8ZjwcIRs6TzzmYvioOQ64iCk65AkyLUK9nc/B4ZB9vl/Kr4fTYNiEcszIc+RGg2irX04PES2dPCnqJfP4jy+hvPUWaSzl1CLFke7Y1GULQtMk/rjS0iy4AYQBCTdsSDydTpweEg+HEC9jnV2Edk2iXa6Qh1hmoJN/+CueJ+2jX9rl7zbgelUyCuDQFzz3Zt/yUfYXy3O1uaRJYkojbBVi1PFhUc2qHuTQ6IsQpM1LtfPYWsmX917j+cW1mnaNQxV+Mw/GB0gIeEnITvjDrZm4mgWR8ERtmayXl1Bl3Vs1WK5NMcoGnFtaZ66bVNvlJnxCtzsbrJYnGG9OsMo8lFkmafmFvnh8y9weW0RR7X5u9/1Mqv1Kh9YXWbGafHzn/xe0ixhbqaOKiv853/rBwiShKplHe/mRSCIKquESYQsSZRNwSH4vktPPurWt0dCHth261xurNPzfZ5oXSZKY3RZI8syCrrDJ1YfI0gSbNXkwbBHUffI8ozD6ZSW6/LTL71IN+ixXll
ia7hHnInuw9VsPnvuCeIs4R/9yT/D1VwkJFZKi5RtiyAJWSnNszM64GDaJUpTVFljuTTH584+yXPzq1xtXMBQdL52+wGKJLLqF7wZnl9coWq57I2HTGLBtZAkiaNpn5vd+8iSwnyhybl6nZbr8ur2Fm/vP2R/0mUcRYyiEQteDV3W2RoOHl1DliTSPEWTVW71Hnwb7sw/H5qzFXYOJhiaQhgmaJpC0daJ4pTpMfs+SXOeWCkjyxJr6y0qFYsoyURRtjXCMGEwjTg4mPB7X90+jqqF8JjkZugKg2mEaWq4rsHD/TFz80UhW/MMlpbKjKYRrquz0PZ4+GDANEwI/JDFVoFKxWZjtcqFy3PYttCOVx2NH/3UeY6Opiiqgq7K/MSnLzAahUwmMdMwwbV1/FBEL0dJdjzNFXLC5VNV0jQjSTJef2+fYZBSdg1sSyOYBFxZqtDrTQmCWITAqDLr59pEUYrr6AyHIboiMw4SHEfDNFWeeGKJwUD4v2w+HGLpIitgrl3AtlVkSeKf/MFd3GOmfcHSKBQMRpOQLMtRVZnD7pRq3aNiq9RqNiVXZ3m+iK3LFIsmaZoTxRnGcVDPwjGx7+hoypduHGHqCo4rMkZ298fESUbJM7m80aBZttjtClLgyBdeGlVTZ+Nsi52HXeI4YzgMicKIctmiUbJIkvSb3jvwZxR6ud0URTY5HglPJoKZ/Y0iaZowHoti//RzSLOzoqiHIfn+Afn+HvLGGXFYGAxQyw70+7C1Rd7rwcIC2W/+OnmvIxj+hgF7QnevnFlFu34F8+VnReb5wQHYNtHBEH1ljnw6Zfz5X0FenBfTBVkmPhyJNDXdgJ0dcBzWnpyHIOCdf/JrmIs1zM99GklX8e8dkB8cguOSHvYwZsrk29vCmvfCJSTXFROKgyH5nduEW4fku7tUzjTA9zE/dB3abfJOR0Q06jrKbFPs1l0D6dpTEEUU/9YHkQ0VaXUFymXC2zuiEAcByDLp/vGBaTIRE49OB2n9AgDOZ15EtjQYj5EUmej2A2E6pOmEf/g60vIy2cER+Y13MS+fxivqpA92RLSsppFFCdH+AOviKdL3b5H0J1hLNfHbVavisBaG4jCxvoG+Ooe0vASjEeZHniff28F/6zakKdbZBXF40TSkuVmRbHh4iPZdL/4FH11/PXB0kRefZOkjxztD0anbZezjjv3BaJcojblcP8+11jkKuoOlmry9/1CQONvnOZiKeNqCbjGJfQ6nPUbRhLbT5N/efY1+OESWZCaxz+74EAl4cekyP3zxWb5v44NcbJym4w8EQS3LOFWaQ0LiF175TRa8IoosLEPfuveAa81zxFnM0bSPoRh86solxtGU/+xX/zXn6g1+4sJnKBkFHgy7bI8OKGguqqxyvr7I1nCH7dGIuUIbz3DQFJWDyYRDv8PN7n2G0ZgPLZ3mZv8upytLXG6sE6YRt3qbmKrJqVKbJEuJs4yW3WQYjfnoymUAFgozVM0Kh9MeG9VlJrGPqRr0giFl02MQjbg+s0Ev7DPvztG2W3z/2edIs4xuMODh6IjbvR1mCwVMxeC/f/WLlI0SN7q7PBg/pKDblMoFdieHnK+eJskSDFXnVveQjdosf/zwPXbGRzx/ZgVHtzhdWaIbCsJqzSoy49ZZq1R4cfkCk3jKhxausTfd52ZXHEo2am2KukfdqjLnzrA36fDu4T2em3vs23Jv/nkQxxmeZ9Abh8RxRq/nEyUpRUcnSjIansnN2x32hzGNhotra1RcA0WWUCSJhmfSqjoMpzGlkkmz6QiS3bFuvVHQeev9Q0ajiGrFQpIgjlM0TWG24lApmrywXqdSMPEcnf3uhGLJomBphNOQ3/vDOyzPekRJhmOoHB1NKRR0upOENzf7hGHC6Y02fpTyP/ziGxSLJj/20ikcU6PbF53yYsUiSTMKrsHXbh5x62Gf1RmPWtF6NJbujEJub/apeSaXnljl9Xtd1k/XObdcZTgMub3Vp12xsW2N4Sh
EliV6k5DhNKJVc4jjjLJjkOdCXXB1o8GXv7qNW3KJ04xqyaI7jWlXbEZBwlOnayzWHE7NFen3AyoVS3izxBmLc55YoQQp7bLNyI/5k9sdZusumibWI8utAtvdCYoscfvWEeWyxb17PSZ+zMyMhyQJmWSS5QxGITc3exi6gmWozDVdTE1hpVVgaxBw88Yh/sSnVrOp1Wx0U6dVc4iSjIODCZdP17/p/fOtO3rThNlZURh8H3xf7N9BFPjJBEyT9K13yd9+nXx7G2lBmOtIK6dEAbEdQdzTNNHpBwG5HyA1GqJAGsJtj2P9L4aB8dzj4PtIikI+GIgOWJaFJexcRUwLHAfn4oJYLagqpCnup18Q1zjcE0VJ19GX20hXH+f047PCy30sDgPWs1eQih5EEZKuwuws6d4hWtUl390WRTcISCch2DbGxhK4LvHhiHxvH0YjpAvXkK48LkyCJhPx+Wwb+1Mvk7/5GvJnfgSpWhfKg9t34OgIyTz+Ho4168nAF8XWccSqIAzhcB9u3yYPQ+TZNlgWqmehL7ZEWmB7CbXqgqqJg87ODgQBhZUGSq1MvnVsPzw/g97whFIiStHqBfIoEd/3zg7pQQeiiGhzHxQFaXEJqT0HmkYeR0hzi5hzFfFbu2LCwXSKVK3C7CxSsUjeOfxLPsL+amGpBi27cbyPjgnTEFMxKZseXX9ALxhQNUu8fvAubx99jZ3xAUW9yNc7t3l+cYNJPBXBE4pG3S6zWGyzNz6iFwwp6A5BGlC1LKpWmY7fp24JFvL5+jISEsveItvjPRRJYRoHpFnGUrGFp3uUTY8PrqyyUT3FJJ6gyio/89xHMRSDJE/RFFH8/SRkrbTCRx6/QC/wGYQDFrwZrs+uM44mIt5WkijoYny3Uavzys6btJwaAP0gwNFsalYZSzW52z/g4WgfUzFoO20+eeojzBYaZHmGqRpkZPzQ2Q/zR7uv8aG5F5hx2mx1+2yNdhhEQ0bRhLJRoqA7zLpNJEnCVm1s1cTVHVRZYRyPuTW4TZYLwyIJiaZTZqnY4NOnP0Cap8xVSqR5ylqlRS8YkmQp87UyZdPjzkBYDRd0m9lCAUPRsY5VJ0XDwVB07vYfcjDposoKd/sHNO0a5+vLLHsLLHht/DRg3p1jtdxkHE9ZKy8yjseEaYilCrVI262gysq34c7886Fet6m4BkVHx/MMoiglTjKOhgFJmjGYxqiqzCvv7tGq2ARRiqUrbD4YsLs/5t7BmGkonNNOzxUZj0VnL8sy/WnEg26AZWkszXgcHk0ZjyOSJOPsfBlFljg/V+Srm326I8GhmK27rCyUCOOMhVMN5hdrWLrKzv6YnYMJc3MetbKNHyWMpzGKIrM2W6RV0Fk53RLhSeOYgqXx3IUWjZKFqkg4pkazKFLrXEdnr+ejyGJHHUUJlq6wfkqsHaJjj/522WKmqPN9zyww03TJc3BdnSTJaNXEemZ9psBM2WYy8hkHQsLm+zFNzyAMYs6crhPFKaamsNebYqgy+33x2jt9n+D40KMoEp5nMBwGXFoocacTMNN0UWSJimvg+zH7vSmapuAeryO2dkYMpzFeUYTVDDoDOp0p6vHnGnRH9IaCWb9594BpkGAbKitNl2rBoDeJ8EyFpeUKzdkab/3BGxwciHyKO5t9eqOQPEdMr78JvmWhT+/ch/198ZB3nEe6eqlUEh3hYEDWG6A0a6JLjWPyrU2ky4+R3bwNQH73jijUIIqYYSA1G9DvIxkGmR+K/b+iiIQhz0PyiuJad+6I/xNF4pBx965w0Ht4LDUr08hwhgAACHBJREFUl5HcgpgyHB0R/uFXyX2ffDT8f+2VZfJ33sRYqGPMlsknI2i1yDe3RDHPUuRGDcnzUBZmhV78eHpBnmOtNkVhdhwYDHCeXCdPUvw/egse3CV/901yPxBFvNlGWl0n9wVHIX/3NfIHm6K4yhLR4RC9XUEyTLH+iGOMZ66K6UEQQJqSHnbBnyCtrSE98byYpiQ
JWZQgrW+IaMi3/h3q3/6Phevg3h7xbhc0DbVdfUROJM/J9w9QCiYMBqTTEKVSRJ+rkXe6UK2ibJwGXUdxDXJ/Sn6wT37ja1CpCMOH7iHSwjzoOlJ7Bg4PBXkwjoVTYrEobJG/g7E9OmBvekBBF77QpmKiKxo1q0rRLLA/6XA47bJSmqduVUiyhK3RNpcb57jZfUiaZ7zfuYujWUyTgDCJWSrNslZeoB+M8HSPgm6jyRo1q0yeZ8wWmjiaGK/fH22J2NTER5YkBuEIR7e43b+Po1usV5dpOy1czaVsevzyjdeOg3UywQtIpriaxf3RJhu1BRa8GnGW0LAbvN+5z+Pti0hImKpBxSxTsYrsT8bMey26/gBdFgXSUkwc3SLJEzZqswRJwq/ffpWH421eP3xT5LfHE9pOgzOl04Do4Pene/TDPrquUtAdRtGEgu5gKAZHfo+MnHPV0+iKTj8c4SfCZTAnp2k1uFy7TJhGVK0Sg3DE83NPk+QJX3r4Kn//+k/Q8Xs4msUgFA5rZ+uiCBiKWIUMQ6HqGIYTOr6PpRqUTA8/DijoNpfqZ/GTEFWWibKIcTzlrcOvsVCYo6QX2fcP2KiuYKkGjmoLSWTsM4mnpFlK06k8sjf+TsZ0mpCkGXEiPOklSeJwEGDpKqaucG9/RKViPdrLVwuGcA281GYyiTjoTnm4M6To6EzDhG5XdIYLTZeCqVFxhTa/NwkplUwcR6dQ0OlNQnrjkFduHfFwb8RCXUgRdztTjGPjHs8zqJYt5iumsNVVJXq9QFhsFwzWZotYlsbIj7nX8fEKBo2iRW+asFi12DyaslR3sDWZkq2zXLcoujphlDIYh+wcjakWTTRNwdDEemHkxzTrDnfv93jl7V3uH/n8739wn+29Mf1JRLNk0ao7SMDNzT7dScJe3xdJcYho5GLRJM1y4fKnKVxerqBrCmM/5vbukN4wQJMlyo7BaquAriu0SzZbW31OLZW535nyxzePuLIkDNEkCWF4Y+vMz3jMVhyx/e5MeOeNrePduoiytW0dz9aJ44z5pRrnFoWHv+VYTMOE+9sDvnKnw2zZxNQUbh8FLDYKhGHCmScvEARibTYc+ERRQrFoohybKv1p+JaFPumMYWlJPNQLBcGon05hQbDupY9+D8qVS4+6fVxXGOI8vC888h1HsLXjYyZhmgpJXaEAjkP2zrvI7SaSqorxexwJDf7DB0Jj74s9MIbQm2eDEVgW+bHXN5omil2WkRz0UAqWKPyOIwpTkiCd2YBOh3Q4QSqXxCFhNEIyDdKv33z0vrJ33kWqCkc8RuJ1KJeRdVX8fX9f+AJs7iLPtLA+8ixkOVK5jHzurCAZbt4T04ReD8Zj8js3xcFFV5FmZ9HnG2QTn3x/j3Tgi/XFdEqepo8+j2zp5IcHZO++B2+/irQsOAhKqw6KRv72G+JQdONtobXvHx+iXBcWFgDIj44gSYgOR0jrZ4g291Eck/SoT3I4EJ/R90VnPxyiFGxxMNneQbr6OBwdkScJ+XgsiJiTCfnRoThEjEZItkN8d5v47ffB8f4cj6u/fvTDEbNOi1E0xVAMwjSkHw6oGMKv+hOnXuZMZYVhNEaWZDzDRZYktkYPKRo2lmrgGS5JlpJmmdjj795ClVU8w+X1/Xc5XVkmzoSn/jAa0/H73B9so0gKw3CMJmsUNBdTNWg5NSaRjx8HGIpOkIb4iU+YhuyOD1mrVDAVE08vUDZK9MMh19tXCZKQaRxQsYpMkgn7031kSeJG9y5hGjIIR7x58DVczcFSRVEumQXCNGTe83jr8H1G0QRbtXlzf4tFr8lzC2dxNZeKWeZ8dQNTMRjFQlJ4s38XQzHohX0KmstSqUTDqlG3KkiSRCfoMoqmDKMRh/4RiiQTppEIl1JURtGInckeb3Xe4kJ1nZyclfI8YRryhRt/xIxb5/7oPpIkcbu3TZyltJwaVavE3vHhS0JiGI651jrLwbTPqVKVw+mAcTQlPg658ZMpk9h
Hk2V0WWcYTniq/TjjeIyhGMRZjCar7I2PuDd8QMUssTM+ZNadYXvU495gB0e1v8136Z+NIIgpOTqjIEbTZKbTiKOjiciRn8ZcX6sy1xBF+P7ukDDJ+Mp7e+x0p5RK1rHjmoEqS4RxRq1mc/vGPlv7Y1xT4817XRbqLuNxRKczRddFtshwGv9/Crqlq7imxmzNIUkzpmGCogjLZj/KKNg6k0mMYSjYhoqpSmy0bLrdKQVL42gY8ODhgJKjczDwmUQZg2nE+ztD9kYxuiZzc2+CZ+t0u0KvHoYp2/tjZFni/Yd9bEMlTjI6PZ9m0+XyRgNNlWnWHb732QWyXMTujqYxqiJzaa1GmuecajikiZgCLC2VmGu43DkYM7dYRZUlbu+NjrNNREcdxxk9P+H23pDdns/5U1UyckxTo+qZ3N8doikyFUvF0RX2usJNMMtzVFni6w969CbhIyvcxfkivh9TbVUJgoTe5N9H1R4MfA4OxpiWTtUzGY8j1udK3D+aUrY1hkFKsyhIkWmaEwYxwTRgYbHMaBQJHob5zTt66Vsx9U5wghOc4AQnOMF/2PgzDXNOcIITnOAEJzjBf7g4KfQnOMEJTnCCE/wNxkmhP8EJTnCCE5zgbzBOCv0JTnCCE5zgBH+DcVLoT3CCE5zgBCf4G4yTQn+CE5zgBCc4wd9g/D8bhgzDKLnQzgAAAABJRU5ErkJggg==\n", - "text/plain": [ - "
" - ] - }, - "metadata": { - "needs_background": "light" - }, - "output_type": "display_data" - } - ], - "source": [ - "im = image2tensor(Image.open('images/grizzly.jpg'))\n", - "_,axs = subplots(1,3)\n", - "for bear,ax,color in zip(im,axs,('Reds','Greens','Blues')):\n", - " show_image(255-bear, ax=ax, cmap=color)" + "Make sure that you understand *why* these are the shapes for our mini-batches." ] }, { @@ -2056,28 +1750,28 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "As you can see, we haven't had to use a separate *image regression* application; all we've had to do is label the data, and tell fastai what kind of data the independent and dependent variables represent." + "As you can see, we haven't had to use a separate *image regression* application; all we've had to do is label the data, and tell fastai what kinds of data the independent and dependent variables represent." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "It's the same for creating our `Learner`. We will use the same function as before, this time with just a new parameter and we will ready to train our model." + "It's the same for creating our `Learner`. We will use the same function as before, with one new parameter, and we will be ready to train our model." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Training a model" + "### Training a Model" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "As usual we can use `cnn_learner` to create our `Learner`. Remember way back in <> how we used `y_range` to tell fastai the range of our targets? We'll do the same here; coordinates in fastai and PyTorch are always rescaled between -1 and +1." + "As usual, we can use `cnn_learner` to create our `Learner`. Remember way back in <> how we used `y_range` to tell fastai the range of our targets? 
We'll do the same here (coordinates in fastai and PyTorch are always rescaled between -1 and +1):" ] }, { @@ -2109,7 +1803,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "This is set as the final layer of the model, if `y_range` is defined. Take a moment to think about what this function does, and why it forces the model to output activations in the range `(low,high)`.\n", + "This is set as the final layer of the model, if `y_range` is defined. Take a moment to think about what this function does, and why it forces the model to output activations in the range `(lo,hi)`.\n", "\n", "Here's what it looks like:" ] @@ -2167,11 +1861,11 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "This makes sense, since when coordinates are used as dependent variable, most of the time we're likely to be trying to predict something as close as possible; that's basically what `MSELoss` (mean-squared error loss) does. If you want to use a different loss function, you can pass it to `cnn_learner` using the `loss_func` parameter.\n", + "This makes sense, since when coordinates are used as the dependent variable, most of the time we're likely to be trying to predict something as close as possible; that's basically what `MSELoss` (mean squared error loss) does. If you want to use a different loss function, you can pass it to `cnn_learner` using the `loss_func` parameter.\n", "\n", "Note also that we didn't specify any metrics. That's because the MSE is already a useful metric for this task (although it's probably more interpretable after we take the square root). 
\n",
     "\n",
-    "We can pick a good learning rate with the Learning Rate Finder:"
+    "We can pick a good learning rate with the learning rate finder:"
    ]
   },
@@ -2220,7 +1914,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "We'll try an LR of `1e-2`:"
+    "We'll try an LR of 1e-2:"
    ]
   },
@@ -2308,7 +2002,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "Generally when we run this we get a loss of around `0.0001`, which corresponds to an average coordinate prediction error of:"
+    "Generally when we run this we get a loss of around 0.0001, which corresponds to an average coordinate prediction error of:"
    ]
   },
@@ -2335,7 +2029,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "This sounds very accurate! But most importantly, we should have a *look* at our results with `Learner.show_results`. The left side is actual (*ground truth*) and the right side are our model's predictions. "
+    "This sounds very accurate! But it's important to take a look at our results with `Learner.show_results`. The left side are the actual (*ground truth*) coordinates and the right side are our model's predictions:"
    ]
   },
@@ -2388,9 +2082,9 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "In problems that are at first glance completely different (single-label classification, multi-label classification and regression) we end up using the same model with just different numbers of outputs. The different directions of those trainings is determined by the loss function, which is the one thing that changes. That's why it's important to double-check your are using the right loss function for your problem.\n",
+    "In problems that are at first glance completely different (single-label classification, multi-label classification, and regression), we end up using the same model with just different numbers of outputs. 
The loss function is the one thing that changes, which is why it's important to double-check that you are using the right loss function for your problem.\n",
     "\n",
-    "In fastai, the library will automatically try to pick the right one from the data you built, but if you are using pure PyTorch to build your `DataLoader`s, make sure you think hard when you have to decide on your loss function, and remember that you most probably want\n",
+    "fastai will automatically try to pick the right one from the data you built, but if you are using pure PyTorch to build your `DataLoader`s, make sure you think hard when you have to decide on your loss function, and remember that you most probably want:\n",
     "\n",
     "- `nn.CrossEntropyLoss` for single-label classification\n",
     "- `nn.BCEWithLogitsLoss` for multi-label classification\n",
@@ -2408,21 +2102,21 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "1. how could multi-label classification improve the usability of the bear classifier?\n",
+    "1. How could multi-label classification improve the usability of the bear classifier?\n",
     "1. How do we encode the dependent variable in a multi-label classification problem?\n",
     "1. How do you access the rows and columns of a DataFrame as if it was a matrix?\n",
     "1. How do you get a column by name from a DataFrame?\n",
-    "1. What is the difference between a dataset and DataLoader?\n",
-    "1. What does a Datasets object normally contain?\n",
-    "1. What does a DataLoaders object normally contain?\n",
-    "1. What does lambda do in Python?\n",
-    "1. What are the methods to customise how the independent and dependent variables are created with the data block API?\n",
+    "1. What is the difference between a `Dataset` and `DataLoader`?\n",
+    "1. What does a `Datasets` object normally contain?\n",
+    "1. What does a `DataLoaders` object normally contain?\n",
+    "1. What does `lambda` do in Python?\n",
+    "1. 
What are the methods to customize how the independent and dependent variables are created with the data block API?\n",
     "1. Why is softmax not an appropriate output activation function when using a one hot encoded target?\n",
-    "1. Why is nll_loss not an appropriate loss function when using a one hot encoded target?\n",
+    "1. Why is `nll_loss` not an appropriate loss function when using a one-hot-encoded target?\n",
     "1. What is the difference between `nn.BCELoss` and `nn.BCEWithLogitsLoss`?\n",
     "1. Why can't we use regular accuracy in a multi-label problem?\n",
-    "1. When is it okay to tune an hyper-parameter on the validation set?\n",
-    "1. How is `y_range` implemented in fastai? (See if you can implement it yourself and test it without peaking!)\n",
+    "1. When is it okay to tune a hyperparameter on the validation set?\n",
+    "1. How is `y_range` implemented in fastai? (See if you can implement it yourself and test it without peeking!)\n",
     "1. What is a regression problem? What loss function should you use for such a problem?\n",
     "1. What do you need to do to make sure the fastai library applies the same data augmentation to your input images and your target point coordinates?"
    ]
   },
@@ -2431,15 +2125,15 @@
   {
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "### Further research"
+    "### Further Research"
    ]
   },
   {
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "1. Read a tutorial about pandas DataFrames and experiment with a few methods that look interesting to you. Have a look at the book website for recommended tutorials.\n",
-    "1. Retrain the bear classifier using multi-label classification. See if you can make it work effectively with images that don't contain any bears, including showing that information in the web application. Try an image with two different kinds of bears. Check whether the accuracy on the single label dataset is impacted using multi-label classification."
+    "1. 
Read a tutorial about Pandas DataFrames and experiment with a few methods that look interesting to you. See the book's website for recommended tutorials.\n", + "1. Retrain the bear classifier using multi-label classification. See if you can make it work effectively with images that don't contain any bears, including showing that information in the web application. Try an image with two different kinds of bears. Check whether the accuracy on the single-label dataset is impacted using multi-label classification." ] }, { diff --git a/07_sizing_and_tta.ipynb b/07_sizing_and_tta.ipynb index f59f101ac..10493b772 100644 --- a/07_sizing_and_tta.ipynb +++ b/07_sizing_and_tta.ipynb @@ -21,18 +21,18 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# Training a state-of-the-art model" + "# Training a State-of-the-Art Model" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "This chapter introduces more advanced techniques for training an image classification model and get state-of-the-art results. You can skip it if you want to learn more about other applications of deep learning and come back to it later--nothing in this chapter will be assumed in later chapters.\n", + "This chapter introduces more advanced techniques for training an image classification model and getting state-of-the-art results. You can skip it if you want to learn more about other applications of deep learning and come back to it later--knowledge of this material will not be assumed in later chapters.\n", "\n", - "We will look at powerful data augmentation techniques, the *progressive resizing* approach and test time augmentation. To show all of this, we are going to train a model from scratch (not transfer learning) using a subset of ImageNet called [Imagenette](https://github.com/fastai/imagenette). 
It contains ten very different categories from the original ImageNet dataset, making for quicker training when we want to experiment.\n", +    "We will look at what normalization is, a powerful data augmentation technique called Mixup, the progressive resizing approach, and test time augmentation. To show all of this, we are going to train a model from scratch (not using transfer learning) using a subset of ImageNet called [Imagenette](https://github.com/fastai/imagenette). It contains a subset of 10 very different categories from the original ImageNet dataset, making for quicker training when we want to experiment.\n",     "\n", -    "This is going to be much harder to do well than our previous datasets because we're using full-size, full-color images, which are photos of objects of different sizes, in different orientations, in different lighting, and so forth... So in this chapter we're going to introduce some important techniques for getting the most out of your dataset, especially when you're training from scratch, or transfer learning to a very different kind of dataset to what the pretrained model used." +    "This is going to be much harder to do well than with our previous datasets because we're using full-size, full-color images, which are photos of objects of different sizes, in different orientations, in different lighting, and so forth. So, in this chapter we're going to introduce some important techniques for getting the most out of your dataset, especially when you're training from scratch, or using transfer learning to train a model on a very different kind of dataset than the pretrained model used."
] }, { @@ -48,17 +48,17 @@ "source": [ "When fast.ai first started there were three main datasets that people used for building and testing computer vision models:\n", "\n", - "- *ImageNet*: 1.3 million images of various sizes around 500 pixels across, in 1000 categories, which took a few days to train\n", - "- *MNIST*: 50,000 28x28 pixel greyscale handwritten digits\n", - "- *CIFAR10*: 60,000 32x32 colour images in 10 classes\n", + "- ImageNet:: 1.3 million images of various sizes around 500 pixels across, in 1,000 categories, which took a few days to train\n", + "- MNIST:: 50,000 28\\*28-pixel grayscale handwritten digits\n", + "- CIFAR10:: 60,000 32\\*32-pixel color images in 10 classes\n", "\n", - "The problem is that the small datasets didn't actually generalise effectively to the large ImageNet dataset. The approaches that worked well on ImageNet generally had to be developed and trained on ImageNet. This led to many people believing that only researchers with access to giant computing resources could effectively contribute to developing image classification algorithms.\n", + "The problem was that the smaller datasets didn't actually generalize effectively to the large ImageNet dataset. The approaches that worked well on ImageNet generally had to be developed and trained on ImageNet. This led to many people believing that only researchers with access to giant computing resources could effectively contribute to developing image classification algorithms.\n", "\n", - "We thought that seemed very unlikely to be true. We had never actually seen a study that showed that ImageNet happen to be exactly the right size, and that other datasets could not be developed which would provide useful insights. So we thought we would try to create a new dataset which researchers could test their algorithms on quickly and cheaply, but which would also provide insights likely to work on the full ImageNet dataset.\n", + "We thought that seemed very unlikely to be true. 
We had never actually seen a study that showed that ImageNet happened to be exactly the right size, and that other datasets could not be developed which would provide useful insights. So we thought we would try to create a new dataset that researchers could test their algorithms on quickly and cheaply, but which would also provide insights likely to work on the full ImageNet dataset.\n",     "\n", -    "About three hours later we had created Imagenette. We selected 10 classes from the full ImageNet which look very different to each other. We hope that it would be possible to create a classifier that worked to recognise these classes quickly and cheaply. When we tried it out, we discovered we were right. We then tried out a few algorithmic tweaks to see how they impacted Imagenette, found some which worked pretty well, and tested them on ImageNet as well — we were very pleased to find that our tweaks worked well on ImageNet too!\n", +    "About three hours later we had created Imagenette. We selected 10 classes from the full ImageNet that looked very different from one another. As we had hoped, we were able to quickly and cheaply create a classifier capable of recognizing these classes. We then tried out a few algorithmic tweaks to see how they impacted Imagenette. We found some that worked pretty well, and tested them on ImageNet as well—and we were very pleased to find that our tweaks worked well on ImageNet too!\n",     "\n", -    "There is an important message here: the dataset you get given is not necessarily the dataset you want; it's particularly unlikely to be the dataset that you want to do your development and prototyping in. You should aim to have an iteration speed of no more than a couple of minutes — that is, when you come up with a new idea you want to try out, you should be able to train a model and see how it goes within a couple of minutes. 
If it's taking longer to do an experiment, think about how you could cut down your dataset, or simplify your model, to improve your experimentation speed. The more experiments you can do, the better!\n", + "There is an important message here: the dataset you get given is not necessarily the dataset you want. It's particularly unlikely to be the dataset that you want to do your development and prototyping in. You should aim to have an iteration speed of no more than a couple of minutes—that is, when you come up with a new idea you want to try out, you should be able to train a model and see how it goes within a couple of minutes. If it's taking longer to do an experiment, think about how you could cut down your dataset, or simplify your model, to improve your experimentation speed. The more experiments you can do, the better!\n", "\n", "Let's get started with this dataset:" ] @@ -77,7 +77,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "First we'll get our dataset into a `DataLoaders` object, using the *presizing* trick we saw in <>:" + "First we'll get our dataset into a `DataLoaders` object, using the *presizing* trick introduced in <>:" ] }, { @@ -98,7 +98,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - " ...and do a training that will serve as a baseline:" + "and do a training run that will serve as a baseline:" ] }, { @@ -176,7 +176,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "That's a good baseline, since we are not using a pretrained model, but we can do better. When working with models that are being trained from scratch, or fine-tuned to a very different dataset to that used for the pretraining, there are additional techniques that are really important. In the rest of the chapter we'll consider some of the key approaches you'll want to be familiar with. The first one is normalizing your data." + "That's a good baseline, since we are not using a pretrained model, but we can do better. 
When working with models that are being trained from scratch, or fine-tuned to a very different dataset than the one used for the pretraining, there are some additional techniques that are really important. In the rest of the chapter we'll consider some of the key approaches you'll want to be familiar with. The first one is *normalizing* your data." ] }, { @@ -190,7 +190,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "When training a model, it helps if your input data is normalized, that is, as a mean of 0 and a standard deviation of 1. But most images and computer vision libraries will use values between 0 and 255 for pixels, or between 0 and 1; in either case, your data is not going to have a mean of zero and a standard deviation of one.\n", + "When training a model, it helps if your input data is normalized--that is, has a mean of 0 and a standard deviation of 1. But most images and computer vision libraries use values between 0 and 255 for pixels, or between 0 and 1; in either case, your data is not going to have a mean of 0 and a standard deviation of 1.\n", "\n", "Let's grab a batch of our data and look at those values, by averaging over all axes except for the channel axis, which is axis 1:" ] @@ -221,9 +221,9 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "As we expected, its mean and standard deviation is not very close to the desired values of zero and one. This is easy to do in fastai by adding the `Normalize` transform. This acts on a whole mini batch at once, so you can add it to the `batch_tfms` section of your data block. You need to pass to this transform the mean and standard deviation that you want to use; fastai comes with the standard ImageNet mean and standard deviation already defined. (If you do not pass any statistics to the Normalize transform, fastai will automatically calculate them from a single batch of your data.)\n", + "As we expected, the mean and standard deviation are not very close to the desired values. 
Fortunately, normalizing the data is easy to do in fastai by adding the `Normalize` transform. This acts on a whole mini-batch at once, so you can add it to the `batch_tfms` section of your data block. You need to pass to this transform the mean and standard deviation that you want to use; fastai comes with the standard ImageNet mean and standard deviation already defined. (If you do not pass any statistics to the `Normalize` transform, fastai will automatically calculate them from a single batch of your data.)\n",     "\n", -    "Let's add this transform (using `imagenet_stats` as Imagenette is a subset of ImageNet) and have a look at one batch now:" +    "Let's add this transform (using `imagenet_stats` as Imagenette is a subset of ImageNet) and take a look at one batch now:" ] }, { @@ -277,7 +277,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ -    "Let's check how normalization helps training our model here:" +    "Let's check what effect this had on training our model:" ] }, { @@ -355,27 +355,27 @@ "cell_type": "markdown", "metadata": {}, "source": [ -    "Although it only helped a little here, normalization becomes especially important when using pretrained models. The pretrained model only knows how to work with data of the type that it has seen before. If the average pixel was zero in the data it was trained with, but your data has zero as the minimum possible value of a pixel, then the model is going to be seeing something very different to what is intended! \n", +    "Although it only helped a little here, normalization becomes especially important when using pretrained models. The pretrained model only knows how to work with data of the type that it has seen before. If the average pixel value was 0 in the data it was trained with, but your data has 0 as the minimum possible value of a pixel, then the model is going to be seeing something very different from what is intended! 
\n", "\n", "This means that when you distribute a model, you need to also distribute the statistics used for normalization, since anyone using it for inference, or transfer learning, will need to use the same statistics. By the same token, if you're using a model that someone else has trained, make sure you find out what normalization statistics they used, and match them.\n", "\n", - "We didn't have to handle normalization in previous chapters because when using a pretrained model through `cnn_learner`, the fastai library automatically adds the proper `Normalize` transform; the model has been pretrained with certain statistics in `Normalize` (usually coming from the ImageNet dataset), so the library can fill those for you. Note that this only applies with pretrained models, which is why we need to add it manually here, when training from scratch.\n", + "We didn't have to handle normalization in previous chapters because when using a pretrained model through `cnn_learner`, the fastai library automatically adds the proper `Normalize` transform; the model has been pretrained with certain statistics in `Normalize` (usually coming from the ImageNet dataset), so the library can fill those in for you. Note that this only applies with pretrained models, which is why we need to add this information manually here, when training from scratch.\n", "\n", - "All our training up until now have been done at size 224. We could have begun training at a smaller size before going to that. This is called *progressive resizing*." + "All our training up until now has been done at size 224. We could have begun training at a smaller size before going to that. This is called *progressive resizing*." 
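The per-channel normalization arithmetic described in this section is easy to check for yourself. Here is a minimal NumPy sketch (the random batch is made up for illustration; in practice fastai's `Normalize` transform does this for you, using the statistics you pass it):

```python
import numpy as np

# A made-up mini-batch of 4 RGB images, 8x8 pixels, laid out as
# (batch, channel, height, width) like a PyTorch batch, values in [0, 1].
rng = np.random.default_rng(0)
batch = rng.uniform(0.0, 1.0, size=(4, 3, 8, 8))

# Per-channel statistics: average over all axes except the channel axis (axis 1).
mean = batch.mean(axis=(0, 2, 3), keepdims=True)
std = batch.std(axis=(0, 2, 3), keepdims=True)

# Normalize: subtract the mean and divide by the standard deviation.
normed = (batch - mean) / std

# Each channel now has mean ~0 and standard deviation ~1.
print(normed.mean(axis=(0, 2, 3)), normed.std(axis=(0, 2, 3)))
```

When you use `imagenet_stats` instead of batch statistics, the same arithmetic is applied, just with the fixed ImageNet mean and standard deviation per channel.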
] }, { "cell_type": "markdown", "metadata": {}, "source": [ -    "## Progressive resizing" +    "## Progressive Resizing" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ -    "When fast.ai and its team of students [won the DAWNBench competition](https://www.theverge.com/2018/5/7/17316010/fast-ai-speed-test-stanford-dawnbench-google-intel), one of the most important innovations was something very simple: start training using small images, and end training using large images. By spending most of the epochs training with small images, training completed much faster. By completing training using large images, the final accuracy was much higher. We call this approach *progressive resizing*." +    "When fast.ai and its team of students [won the DAWNBench competition](https://www.theverge.com/2018/5/7/17316010/fast-ai-speed-test-stanford-dawnbench-google-intel) in 2018, one of the most important innovations was something very simple: start training using small images, and end training using large images. Spending most of the epochs training with small images helps training complete much faster. Completing training using large images makes the final accuracy much higher. We call this approach *progressive resizing*." ] }, { @@ -389,15 +389,15 @@ "cell_type": "markdown", "metadata": {}, "source": [ -    "As we have seen, the kinds of features that are learned by convolutional neural networks are not in any way specific to the size of the image — early layers find things like edges and gradients, and later layers may find things like noses and sunsets. So, when we change image size in the middle of training, it doesn't mean that we have to find totally different parameters for our model.\n", +    "As we have seen, the kinds of features that are learned by convolutional neural networks are not in any way specific to the size of the image—early layers find things like edges and gradients, and later layers may find things like noses and sunsets. 
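To see why small images still carry the features that matter, here is a toy NumPy sketch of shrinking an image by block averaging (an illustration only; real pipelines resize with proper interpolation, which fastai's item and batch transforms handle for you):

```python
import numpy as np

def downscale(img, factor):
    # Shrink a square image by averaging non-overlapping factor-by-factor blocks.
    h, w = img.shape
    return img.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

# A made-up 4x4 "image" containing a simple gradient.
img = np.arange(16.0).reshape(4, 4)
small = downscale(img, 2)

# The 2x2 version is coarser, but the gradient (the coarse structure that
# early layers respond to) is still clearly present.
print(small)
```

Fine detail is lost at the smaller size, but edges and gradients survive, which is why weights learned on small images transfer usefully to larger ones.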
So, when we change image size in the middle of training, it doesn't mean that we have to find totally different parameters for our model.\n", "\n", - "But clearly there are some differences between small images and big ones, so we shouldn't expect our model to continue working exactly as well, with no changes at all. Does this remind you of something? When we developed this idea, it reminded us of transfer learning! We are trying to get our model to learn to do something a little bit different to what it has learned to do before. Therefore, we should be able to use the `fine_tune` method after we resize our images.\n", + "But clearly there are some differences between small images and big ones, so we shouldn't expect our model to continue working exactly as well, with no changes at all. Does this remind you of something? When we developed this idea, it reminded us of transfer learning! We are trying to get our model to learn to do something a little bit different from what it has learned to do before. Therefore, we should be able to use the `fine_tune` method after we resize our images.\n", "\n", - "There is an additional benefit to progressive resizing: it is another form of data augmentation. Therefore, you should expect to see better generalisation of your models that are trained with progressive resizing.\n", + "There is an additional benefit to progressive resizing: it is another form of data augmentation. 
Therefore, you should expect to see better generalization in models trained with progressive resizing.\n",     "\n", -    "To implement progressive resizing it is most convenient if you first create a `get_dls` function which takes an image size and a batch size, and returns your `DataLoaders`:\n", +    "To implement progressive resizing it is most convenient if you first create a `get_dls` function which takes an image size and a batch size, as we did in the previous section, and returns your `DataLoaders`:\n",     "\n", -    "Now you can create your `DataLoaders` with a small size, and `fit_one_cycle` in the usual way, for a few less epochs than you might otherwise do:" +    "Now you can create your `DataLoaders` with a small size and use `fit_one_cycle` in the usual way, training for fewer epochs than you might otherwise do:" ] }, { @@ -469,7 +469,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ -    "Then you can replace the DataLoaders inside the Learner, and `fine_tune`:" +    "Then you can replace the `DataLoaders` inside the `Learner`, and fine-tune:" ] }, { @@ -581,47 +581,47 @@     "\n",     "You can repeat the process of increasing size and training more epochs as many times as you like, for as big an image as you wish--but of course, you will not get any benefit by using an image size larger than the size of your images on disk.\n",     "\n", -    "Note that for transfer learning, progressive resizing may actually hurt performance. 
This is most likely to happen if your pretrained model was quite similar to your transfer learning task and dataset and was trained on similar-sized images, so the weights don't need to be changed much. In that case, training on smaller images may damage the pretrained weights.\n", "\n", - "On the other hand, if the transfer learning task is going to be on images that are of different sizes, shapes, or style to those used in the pretraining tasks, progressive resizing will probably help. As always, the answer to \"does it help?\" is \"try it!\".\n", + "On the other hand, if the transfer learning task is going to use images that are of different sizes, shapes, or styles than those used in the pretraining task, progressive resizing will probably help. As always, the answer to \"Will it help?\" is \"Try it!\"\n", "\n", - "Another thing we could try is applying data augmentation to the validation set: up until now, we have only applied it on the training set and the validation set always gets the same images. But maybe we could try to make predictions for a few augmented versions of the validation set and average them. This is called *test time augmentation*." + "Another thing we could try is applying data augmentation to the validation set. Up until now, we have only applied it on the training set; the validation set always gets the same images. But maybe we could try to make predictions for a few augmented versions of the validation set and average them. We'll consider this approach next." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Test time augmentation" + "## Test Time Augmentation" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "We have been using random cropping as a way to get some useful data augmentation, which leads to better generalisation, and results in a need for less training data. 
When we use random cropping, fastai will automatically use centre-cropping for the validation set — that is, it will select the largest square area it can in the centre of the image, such that it does not go past the image edges.\n", + "We have been using random cropping as a way to get some useful data augmentation, which leads to better generalization, and results in a need for less training data. When we use random cropping, fastai will automatically use center cropping for the validation set—that is, it will select the largest square area it can in the center of the image, without going past the image's edges.\n", "\n", - "This can often be problematic. For instance, in a multi-label dataset sometimes there are small objects towards the edges of an image; these could be entirely cropped out by the centre cropping. Even for datasets such as the pet breed classification data we're working on now, it's possible that some critical feature necessary for identifying the correct breed, such as the colour of the nose, could be cropped out.\n", + "This can often be problematic. For instance, in a multi-label dataset sometimes there are small objects toward the edges of an image; these could be entirely cropped out by center cropping. Even for problems such as our pet breed classification example, it's possible that some critical feature necessary for identifying the correct breed, such as the color of the nose, could be cropped out.\n", "\n", - "One solution to this is to avoid random cropping entirely. Instead, we could simply squish or stretch the rectangular images to fit into a square space. But then we miss out on a very useful data augmentation, and we also make the image recognition more difficult for our model, because it has to learn how to recognise squished and squeezed images, rather than just correctly proportioned images.\n", + "One solution to this problem is to avoid random cropping entirely. 
Instead, we could simply squish or stretch the rectangular images to fit into a square space. But then we miss out on a very useful data augmentation, and we also make the image recognition more difficult for our model, because it has to learn how to recognize squished and squeezed images, rather than just correctly proportioned images.\n", "\n", - "Another solution is to not just centre crop for validation, but instead to select a number of areas to crop from the original rectangular image, pass each of them through our model, and take the maximum or average of the predictions. In fact, we could do this not just for different crops, but for different values across all of our test time augmentation parameters. This is known as *test time augmentation* (TTA)." + "Another solution is to not just center crop for validation, but instead to select a number of areas to crop from the original rectangular image, pass each of them through our model, and take the maximum or average of the predictions. In fact, we could do this not just for different crops, but for different values across all of our test time augmentation parameters. This is known as *test time augmentation* (TTA)." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "> jargon: test time augmentation (TTA): during inference or validation, creating multiple versions of each image, using data augmentation, and then taking the average or maximum of the predictions for each augmented version of the image" + "> jargon: test time augmentation (TTA): During inference or validation, creating multiple versions of each image, using data augmentation, and then taking the average or maximum of the predictions for each augmented version of the image." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Depending on the dataset, test time augmentation can result in dramatic improvements in accuracy. 
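The combining step at the heart of this idea is simple to sketch. Here is a toy NumPy illustration of averaging predictions for one image (the probability numbers are invented; in practice fastai's `tta` method produces and combines the predictions for you):

```python
import numpy as np

# Hypothetical class probabilities for a single image: one prediction for the
# unaugmented center crop plus four for randomly augmented versions (3 classes).
preds = np.array([
    [0.60, 0.30, 0.10],  # center crop
    [0.55, 0.35, 0.10],
    [0.70, 0.20, 0.10],
    [0.40, 0.45, 0.15],  # this augmented version missed the key feature
    [0.65, 0.25, 0.10],
])

avg_pred = preds.mean(axis=0)  # average the predictions...
max_pred = preds.max(axis=0)   # ...or take the maximum instead

print(avg_pred.argmax())  # -> 0: averaging smooths out the one unlucky crop
```

One unlucky crop (the fourth row) would have flipped the decision on its own; averaging across augmented versions makes the final prediction more robust.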
It does not change the time required to train at all, but will increase the amount of time for validation or inference by the number of test time augmented images requested. By default, fastai will use the unaugmented centre crop image, plus four randomly augmented images.\n", +    "Depending on the dataset, test time augmentation can result in dramatic improvements in accuracy. It does not change the time required to train at all, but will increase the amount of time required for validation or inference by the number of test-time-augmented images requested. By default, fastai will use the unaugmented center crop image plus four randomly augmented images.\n",     "\n", -    "You can pass any DataLoader to fastai's `tta` method; by default, it will use your validation set:" +    "You can pass any `DataLoader` to fastai's `tta` method; by default, it will use your validation set:" ] }, { @@ -699,9 +699,9 @@ "cell_type": "markdown", "metadata": {}, "source": [ -    "As we can see, using TTA gives us good a boost of performance, with no additional training required. However, it does make inference slower--if you're averaging 5 images for TTA, inference will be 5x slower.\n", +    "As we can see, using TTA gives us a good boost in performance, with no additional training required. However, it does make inference slower--if you're averaging five images for TTA, inference will be five times slower.\n",     "\n", -    "Data augmentation helps train better models as we saw. Let's now focus on a new data augmentation technique called *Mixup*." +    "We've seen examples of how data augmentation helps train better models. Let's now focus on a new data augmentation technique called *Mixup*."
] }, { @@ -715,16 +715,16 @@ "cell_type": "markdown", "metadata": {}, "source": [ -    "Mixup, introduced in the 2017 paper [mixup: Beyond Empirical Risk Minimization](https://arxiv.org/abs/1710.09412), is a very powerful data augmentation technique which can provide dramatically higher accuracy, especially when you don't have much data, and don't have a pretrained model that was trained on data similar to your dataset. The paper explains: \"While data augmentation consistently leads to improved generalization, the procedure is dataset-dependent, and thus requires the use of expert knowledge.\" For instance, it's common to flip images as part of data augmentation, but should you flip only horizontally, or also vertically? The answer is that it depends on your dataset. In addition, if flipping (for instance) doesn't provide enough data augmentation for you, you can't \"flip more\". It's helpful to have data augmentation techniques where you can \"dial up\" or \"dial down\" the amount of data augmentation, to see what works best for you.\n", +    "Mixup, introduced in the 2017 paper [\"*mixup*: Beyond Empirical Risk Minimization\"](https://arxiv.org/abs/1710.09412) by Hongyi Zhang et al., is a very powerful data augmentation technique that can provide dramatically higher accuracy, especially when you don't have much data and don't have a pretrained model that was trained on data similar to your dataset. The paper explains: \"While data augmentation consistently leads to improved generalization, the procedure is dataset-dependent, and thus requires the use of expert knowledge.\" For instance, it's common to flip images as part of data augmentation, but should you flip only horizontally, or also vertically? The answer is that it depends on your dataset. 
In addition, if flipping (for instance) doesn't provide enough data augmentation for you, you can't \"flip more.\" It's helpful to have data augmentation techniques where you can \"dial up\" or \"dial down\" the amount of change, to see what works best for you.\n", "\n", "Mixup works as follows, for each image:\n", "\n", - "1. Select another image from your dataset at random\n", - "1. Pick a weight at random\n", - "1. Take a weighted average (using the weight from step 2) of the selected image with your image; this will be your independent variable\n", - "1. Take a weighted average (with the same weight) of this image's labels with your image's labels; this will be your dependent variable\n", + "1. Select another image from your dataset at random.\n", + "1. Pick a weight at random.\n", + "1. Take a weighted average (using the weight from step 2) of the selected image with your image; this will be your independent variable.\n", + "1. Take a weighted average (with the same weight) of this image's labels with your image's labels; this will be your dependent variable.\n", "\n", - "In pseudo-code, we're doing (where `t` is the weight for our weighted average):\n", + "In pseudocode, we're doing this (where `t` is the weight for our weighted average):\n", "\n", "```\n", "image2,target2 = dataset[randint(0,len(dataset)]\n", @@ -733,7 +733,7 @@ "new_target = t * target1 + (1-t) * target2\n", "```\n", "\n", - "For this to work, our targets need to be one-hot encoded. The paper describes this using these equations (where $\\lambda$ is the same as `t` in our code above):" + "For this to work, our targets need to be one-hot encoded. 
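The four steps above can be made concrete with a small NumPy sketch (the "images" here are made-up random arrays, and the weight is drawn uniformly as in the pseudocode; this is an illustration only, not fastai's `MixUp` implementation):

```python
import numpy as np

rng = np.random.default_rng(42)

# 1. Two made-up grayscale "images", and one-hot targets over 10 classes.
image1, image2 = rng.uniform(0.0, 1.0, size=(2, 8, 8))
target1 = np.eye(10)[2]  # class 2
target2 = np.eye(10)[7]  # class 7

# 2. Pick a weight at random.
t = rng.uniform(0.5, 1.0)

# 3. Weighted average of the two images: the new independent variable.
new_image = t * image1 + (1 - t) * image2

# 4. Weighted average of the two one-hot targets: the new dependent variable.
new_target = t * target1 + (1 - t) * target2
```

The mixed target has `t` at index 2, `1 - t` at index 7, and zeros everywhere else, which is exactly why the targets must be one-hot encoded for the weighted average of labels to make sense.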
The paper describes this using the equations shown in <> where $\\lambda$ is the same as `t` in our pseudocode:" ] }, { @@ -747,16 +747,16 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Sidebar: Papers and math" + "### Sidebar: Papers and Math" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "We're going to be looking at more and more research papers from here on in the book. Now that you have the basic jargon, you might be surprised to discover how much of them you can understand, with a little practice! One issue you'll notice is that greek letters, such as $\\lambda$, appear in most papers. It's a very good idea to learn the names of all the greek letters, since otherwise it's very hard to read the papers to yourself, and remember them, and it's also hard to read code based on them (since code often uses the name of the greek letter spelled out, such as `lambda`).\n", + "We're going to be looking at more and more research papers from here on in the book. Now that you have the basic jargon, you might be surprised to discover how much of them you can understand, with a little practice! One issue you'll notice is that Greek letters, such as $\\lambda$, appear in most papers. It's a very good idea to learn the names of all the Greek letters, since otherwise it's very hard to read the papers to yourself, and remember them (or to read code based on them, since code often uses the names of the Greek letters spelled out, such as `lambda`).\n", "\n", - "The bigger issue with papers is that they use math, instead of code, to explain what's going on. If you don't have much of a math background, this will likely be intimidating and confusing at first. But remember: what is being shown in the math, is something that will be implemented in code. It's just another way of talking about the same thing! After reading a few papers, you'll pick up more and more of the notation. 
If you don't know what a symbol is, try looking it up on Wikipedia's [list of mathematical symbols](https://en.wikipedia.org/wiki/List_of_mathematical_symbols) or draw it on [Detexify](http://detexify.kirelabs.org/classify.html) which (using machine learning!) will find the name of your hard-drawn symbol. Then you can search online for that name to find out what it's for." +    "The bigger issue with papers is that they use math, instead of code, to explain what's going on. If you don't have much of a math background, this will likely be intimidating and confusing at first. But remember: what is being shown in the math is something that will be implemented in code. It's just another way of talking about the same thing! After reading a few papers, you'll pick up more and more of the notation. If you don't know what a symbol is, try looking it up in Wikipedia's [list of mathematical symbols](https://en.wikipedia.org/wiki/List_of_mathematical_symbols) or drawing it in [Detexify](http://detexify.kirelabs.org/classify.html), which (using machine learning!) will find the name of your hand-drawn symbol. Then you can search online for that name to find out what it's for." ] }, { @@ -770,7 +770,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ -    "Here's what it looks like when we take a *linear combination* of images, as done in Mixup:" +    "<> shows what it looks like when we take a *linear combination* of images, as done in Mixup." ] }, { @@ -815,11 +815,11 @@ "cell_type": "markdown", "metadata": {}, "source": [ -    "The third image is built by adding 0.3 times the first one and 0.7 times the second. In this example, should the model predict church? gas station? The right answer is 30% church and 70% gas station since that's what we'll get if we take the linear combination of the one-hot encoded targets. 
For instance, if *church* has for index 2 and *gas station* as for index 7, the one-hot-encoded representations are\n", + "The third image is built by adding 0.3 times the first one and 0.7 times the second. In this example, should the model predict \"church\" or \"gas station\"? The right answer is 30% church and 70% gas station, since that's what we'll get if we take the linear combination of the one-hot-encoded targets. For instance, suppose we have 10 classes and \"church\" is represented by the index 2 and \"gas station\" is represented by the index 7. The one-hot-encoded representations are:\n", "```\n", "[0, 0, 1, 0, 0, 0, 0, 0, 0, 0] and [0, 0, 0, 0, 0, 0, 0, 1, 0, 0]\n", "```\n", - "(since we have ten classes in total) so our final target is\n", + "so our final target is:\n", "```\n", "[0, 0, 0.3, 0, 0, 0, 0, 0.7, 0, 0]\n", "```" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "This all done for us inside fastai by adding a `Callback` to our `Learner`. `Callback`s are what is used inside fastai to inject custom behavior in the training loop (like a learning rate schedule, or training in mixed precision). We'll be learning all about callbacks, including how to make your own, in <>. For now, all you need to know is that you use the `cbs` parameter to `Learner` to pass callbacks.\n", + "This is all done for us inside fastai by adding a *callback* to our `Learner`. `Callback`s are what is used inside fastai to inject custom behavior in the training loop (like a learning rate schedule, or training in mixed precision). We'll be learning all about callbacks, including how to make your own, in <>. 
For now, all you need to know is that you use the `cbs` parameter to `Learner` to pass callbacks.\n", "\n", - "Here is how you train a model with Mixup:\n", + "Here is how we train a model with Mixup:\n", "\n", "```python\n", "model = xresnet50()\n", @@ -845,70 +845,70 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "So what happens if we train a model where our data is \"mixed up\" in this way? Clearly, it's going to be harder to train, because it's harder to see what's in each image. And the model has to predict two labels per image, rather than just one, as well as figuring out how much each one is weighted. Overfitting seems less likely to be a problem, because we're not showing the same image each epoch, but are instead showing a random combination of two images.\n", + "What happens when we train a model with data that's \"mixed up\" in this way? Clearly, it's going to be harder to train, because it's harder to see what's in each image. And the model has to predict two labels per image, rather than just one, as well as figuring out how much each one is weighted. Overfitting seems less likely to be a problem, however, because we're not showing the same image in each epoch, but are instead showing a random combination of two images.\n", "\n", - "Mixup requires far more epochs to train to a better accuracy, compared to other augmentation approaches we've seen. You can try training Imagenette with and without Mixup by using the `examples/train_imagenette.py` script in the fastai repo. At the time of writing, the leaderboard in the [Imagenette repo](https://github.com/fastai/imagenette/) is showing that mixup is used for all leading results for trainings of >80 epochs, and for few epochs Mixup is not being used. This is inline with our experience of using Mixup too.\n", + "Mixup requires far more epochs to train to get better accuracy, compared to other augmentation approaches we've seen. 
You can try training Imagenette with and without Mixup by using the *examples/train_imagenette.py* script in the [fastai repo](https://github.com/fastai/fastai). At the time of writing, the leaderboard in the [Imagenette repo](https://github.com/fastai/imagenette/) is showing that Mixup is used for all leading results for trainings of >80 epochs, and for fewer epochs Mixup is not being used. This is in line with our experience of using Mixup too.\n", "\n", - "One of the reasons that mixup is so exciting is that it can be applied to types of data other than photos. In fact, some people have even shown good results by using mixup on activations *inside* their model, not just on inputs--these allows Mixup to be used for NLP and other data types too.\n", + "One of the reasons that Mixup is so exciting is that it can be applied to types of data other than photos. In fact, some people have even shown good results by using Mixup on activations *inside* their models, not just on inputs--this allows Mixup to be used for NLP and other data types too.\n", "\n", - "There's another subtle issue that Mixup deals with for us, which is that it's not actually possible with the models we've seen before for our loss to ever be perfect. The problem is that our labels are ones and zeros, but softmax and sigmoid *never* can equal one or zero. So when we train our model, it causes it to push our activations ever closer to zero and one, such that the more epochs we do, the more extreme our activations become.\n", + "There's another subtle issue that Mixup deals with for us, which is that it's not actually possible with the models we've seen before for our loss to ever be perfect. The problem is that our labels are 1s and 0s, but the outputs of softmax and sigmoid can never equal 1 or 0. 
This means training our model pushes our activations ever closer to those values, such that the more epochs we do, the more extreme our activations become.\n", "\n", - "With Mixup, we no longer have that problem, because our labels will only be exactly one or zero if we happen to \"mix\" with another image of the same class. The rest of the time, our labels will be a linear combination, such as the 0.7 and 0.3 we got in the church and gas station example above.\n", + "With Mixup we no longer have that problem, because our labels will only be exactly 1 or 0 if we happen to \"mix\" with another image of the same class. The rest of the time our labels will be a linear combination, such as the 0.7 and 0.3 we got in the church and gas station example earlier.\n", "\n", - "One issue with this, however, is that Mixup is \"accidentally\" making the labels bigger than zero, or smaller than one. That is to say, we're not *explicitly* telling our model that we want to change the labels in this way. So if we want to change to make the labels closer, or further away, from zero and one, we have to change the amount of Mixup--which also changes the amount of data augmentation, which might not be what we want. There is, however, a way to handle this more directly, which is to use *label smoothing*." + "One issue with this, however, is that Mixup is \"accidentally\" making the labels bigger than 0, or smaller than 1. That is to say, we're not *explicitly* telling our model that we want to change the labels in this way. So, if we want to change to make the labels closer to, or further away from 0 and 1, we have to change the amount of Mixup--which also changes the amount of data augmentation, which might not be what we want. There is, however, a way to handle this more directly, which is to use *label smoothing*." 
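The target mixing just described can be sketched in a few lines of NumPy. This is an illustration only, with a hypothetical helper name, not the fastai implementation: the real `MixUp` callback also mixes the input images themselves and draws the mixing weight from a beta distribution, as described in the paper.

```python
import numpy as np

def mixup_targets(idx_a, idx_b, t, n_classes=10):
    # One-hot vectors for the two classes being mixed
    y_a = np.eye(n_classes)[idx_a]
    y_b = np.eye(n_classes)[idx_b]
    # The target gets the same linear combination applied to the images
    return t * y_a + (1 - t) * y_b

# 30% "church" (index 2) mixed with 70% "gas station" (index 7)
print(mixup_targets(2, 7, 0.3))
```

Running this reproduces the `[0, 0, 0.3, 0, 0, 0, 0, 0.7, 0, 0]` target from the example above.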
] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Label smoothing" + "## Label Smoothing" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "In the theoretical expression of the loss, in classification problems, our targets are one-hot encoded (in practice we tend to avoid doing it to save memory, but what we compute is the same loss as if we had used one-hot encoding). That means the model is trained to return 0 for all categories but one, for which it is trained to return 1. Even 0.999 is not *good enough*, the model will get gradients and learn to predict activations that are even more confident. This encourages overfitting and gives you at inference time a model that is not going to give meaningful probabilities: it will always say 1 for the predicted category even if it's not too sure, just because it was trained this way.\n", + "In the theoretical expression of loss, in classification problems, our targets are one-hot encoded (in practice we tend to avoid doing this to save memory, but what we compute is the same loss as if we had used one-hot encoding). That means the model is trained to return 0 for all categories but one, for which it is trained to return 1. Even 0.999 is not \"good enough\", the model will get gradients and learn to predict activations with even higher confidence. This encourages overfitting and gives you at inference time a model that is not going to give meaningful probabilities: it will always say 1 for the predicted category even if it's not too sure, just because it was trained this way.\n", "\n", - "It can become very harmful if your data is not perfectly labeled. In the bear classifier we studied in <>, we saw that some of the images were mislabeled, or contained two different kinds of bears. In general, your data will never be perfect. 
Even if the labels were manually produced by humans, they could make mistakes, or have differences of opinions on images harder to label.\n", + "This can become very harmful if your data is not perfectly labeled. In the bear classifier we studied in <>, we saw that some of the images were mislabeled, or contained two different kinds of bears. In general, your data will never be perfect. Even if the labels were manually produced by humans, they could make mistakes, or have differences of opinions on images that are harder to label.\n", "\n", - "Instead, we could replace all our `1`s by a number a bit less than `1`, and our `0`s by a number a bit more than `0`, and then train. This is called *label smoothing*. By encouraging your model to be less confident, label smoothing will make your training more robust, even if there is mislabeled data, and will produce a model that generalizes better at inference.\n", + "Instead, we could replace all our 1s with a number a bit less than 1, and our 0s by a number a bit more than 0, and then train. This is called *label smoothing*. By encouraging your model to be less confident, label smoothing will make your training more robust, even if there is mislabeled data. The result will be a model that generalizes better.\n", "\n", - "This is how label smoothing works in practice: we start with one-hot encoded labels, then replace all zeros by $\\frac{\\epsilon}{N}$ (that's the greek letter *epsilon*, which is what was used in the [paper which introduced label smoothing](https://arxiv.org/abs/1512.00567), and is used in the fastai code) where $N$ is the number of classes and $\\epsilon$ is a parameter (usually 0.1, which would mean we are 10% unsure of our labels). Since you want the labels to add up to 1, replace the 1 by $1-\\epsilon + \\frac{\\epsilon}{N}$. 
This way, we don't encourage the model to predict something overconfident: in our Imagenette example where we have 10 classes, the targets become something like:\n", + "This is how label smoothing works in practice: we start with one-hot-encoded labels, then replace all 0s with $\\frac{\\epsilon}{N}$ (that's the Greek letter *epsilon*, which is what was used in the [paper that introduced label smoothing](https://arxiv.org/abs/1512.00567) and is used in the fastai code), where $N$ is the number of classes and $\\epsilon$ is a parameter (usually 0.1, which would mean we are 10% unsure of our labels). Since we want the labels to add up to 1, replace the 1 by $1-\\epsilon + \\frac{\\epsilon}{N}$. This way, we don't encourage the model to predict something overconfidently. In our Imagenette example where we have 10 classes, the targets become something like (here for a target that corresponds to the index 3):\n", "```\n", "[0.01, 0.01, 0.01, 0.91, 0.01, 0.01, 0.01, 0.01, 0.01, 0.01]\n", "```\n", - "(here for a target that corresponds to the index 3). In practice, we don't want to one-hot encode the labels, and fortunately we won't need too (the one-hot encoding is just good to explain what label smoothing is and visualize it)." + "In practice, we don't want to one-hot encode the labels, and fortunately we won't need to (the one-hot encoding is just good to explain what label smoothing is and visualize it)." 
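The arithmetic above can be made concrete with a small sketch (a hypothetical helper for illustration, not the fastai API, which computes the loss without materializing one-hot targets):

```python
import numpy as np

def smooth_targets(idx, n_classes=10, eps=0.1):
    # Every class starts at epsilon/N...
    y = np.full(n_classes, eps / n_classes)
    # ...and the true class gets 1 - epsilon + epsilon/N, so the total is 1
    y[idx] = 1 - eps + eps / n_classes
    return y

print(smooth_targets(3))  # index 3 gets 0.91, every other class 0.01
```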
] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Sidebar: Label smoothing, the paper" + "### Sidebar: Label Smoothing, the Paper" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Here is how the reasoning behind label smoothing was explained in the paper:\n", + "Here is how the reasoning behind label smoothing was explained in the paper by Christian Szegedy et al.:\n", "\n", - "\"This maximum is not achievable for finite $z_k$ but is approached if $z_y\\gg z_k$ for all $k\\neq y$ -- that is, if the logit corresponding to the ground-truth label is much great than all other logits. This, however, can cause two problems. First, it may result in over-fitting: if the model learns to assign full probability to the ground-truth label for each training example, it is not guaranteed to generalize. Second, it encourages the differences between the largest logit and all others to become large, and this, combined with the bounded gradient $\\frac{\\partial\\ell}{\\partial z_k}$, reduces the ability of the model to adapt. Intuitively, this happens because the model becomes too confident about its predictions.\"" + "> : This maximum is not achievable for finite $z_k$ but is approached if $z_y\\gg z_k$ for all $k\\neq y$--that is, if the logit corresponding to the ground-truth label is much greater than all other logits. This, however, can cause two problems. First, it may result in over-fitting: if the model learns to assign full probability to the ground-truth label for each training example, it is not guaranteed to generalize. Second, it encourages the differences between the largest logit and all others to become large, and this, combined with the bounded gradient $\\frac{\\partial\\ell}{\\partial z_k}$, reduces the ability of the model to adapt. Intuitively, this happens because the model becomes too confident about its predictions."
] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Let's practice our paper reading skills to try to interpret this. \"This maximum\" is refering to the previous section of the paper, which talked about the fact that `1` is the value of the label for the positive class. So any value (except infinity) can't result in `1` after sigmoid or softmax. In a paper, you won't normally see \"any value\" written, but instead it would get a symbol; in this case, it's $z_k$. This is helpful in a paper, because it can be refered to again later, and the reader knows what value is being discussed.\n", + "Let's practice our paper-reading skills to try to interpret this. \"This maximum\" is referring to the previous part of the paragraph, which talked about the fact that 1 is the value of the label for the positive class. So it's not possible for any value (except infinity) to result in 1 after sigmoid or softmax. In a paper, you won't normally see \"any value\" written; instead it will get a symbol, which in this case is $z_k$. This shorthand is helpful in a paper, because it can be referred to again later and the reader will know what value is being discussed.\n", "\n", - "Then it says: $z_y\\gg z_k$ for all $k\\neq y$. In this case, the paper immediately follows with \"that is...\", which is handy, because you can just read the English instead of the math. In the math, the $y$ is refering to the target ($y$ is defined earlier in the paper; sometimes it's hard to find where symbols are defined, but nearly all papers will define all their symbols somewhere), and $z_y$ is the activation corresponding to the target. So to get close to `1`, this activation needs to be much higher than all the others for that prediction.\n", + "Then it says \"if $z_y\\gg z_k$ for all $k\\neq y$.\" In this case, the paper immediately follows the math with an English description, which is handy because you can just read that. 
In the math, the $y$ is referring to the target ($y$ is defined earlier in the paper; sometimes it's hard to find where symbols are defined, but nearly all papers will define all their symbols somewhere), and $z_y$ is the activation corresponding to the target. So to get close to 1, this activation needs to be much higher than all the others for that prediction.\n", "\n", - "Next up is \"if the model learns to assign full probability to the ground-truth label for each training example, it is not guaranteed to generalize\". This is saying that making $z_y$ really big means we'll need large weights and large activations throughout our model. Large weights lead to \"bumpy\" functions, where a small change in input results in a big change to predictions. This is really bad for generalization, because it means just one pixel changing a bit could change our prediction entirely!\n", + "Next, consider the statement \"if the model learns to assign full probability to the ground-truth label for each training example, it is not guaranteed to generalize.\" This is saying that making $z_y$ really big means we'll need large weights and large activations throughout our model. Large weights lead to \"bumpy\" functions, where a small change in input results in a big change to predictions. This is really bad for generalization, because it means just one pixel changing a bit could change our prediction entirely!\n", "\n", - "Finally, we have \"it encourages the differences between the largest logit and all others to become large, and this, combined with the bounded gradient $\\frac{\\partial\\ell}{\\partial z_k}$, reduces the ability of the model to adapt\". The gradient of cross entropy, remember, is basically `output-target`, and both `output` and `target` are between zero and one. So the difference is between `-1` and `1`, which is why the paper says the gradient is \"bounded\" (it can't be infinite). Therefore our SGD steps are bounded too. 
\"Reduces the ability of the model to adapt\" means that it is hard for it to be updated in a transfer learning setting. This follows because the difference in loss due to incorrect predictions is unbounded, but we can only take a limited step each time." + "Finally, we have \"it encourages the differences between the largest logit and all others to become large, and this, combined with the bounded gradient $\\frac{\\partial\\ell}{\\partial z_k}$, reduces the ability of the model to adapt.\" The gradient of cross-entropy, remember, is basically `output - target`. Both `output` and `target` are between 0 and 1, so the difference is between `-1` and `1`, which is why the paper says the gradient is \"bounded\" (it can't be infinite). Therefore our SGD steps are bounded too. \"Reduces the ability of the model to adapt\" means that it is hard for it to be updated in a transfer learning setting. This follows because the difference in loss due to incorrect predictions is unbounded, but we can only take a limited step each time." ] }, { @@ -922,7 +922,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "To use it in practice, we just have to change the loss function in our call to `Learner`:\n", + "To use this in practice, we just have to change the loss function in our call to `Learner`:\n", "\n", "```python\n", "model = xresnet50()\n", @@ -931,7 +931,7 @@ "learn.fit_one_cycle(5, 3e-3)\n", "```\n", "\n", - "Like Mixup, you won't generally see significant improvements from label smoothing until you train more epochs. Try it yourself and see: how many epochs do you have to train before label smoothing shows an improvement?" + "Like with Mixup, you won't generally see significant improvements from label smoothing until you train more epochs. Try it yourself and see: how many epochs do you have to train before label smoothing shows an improvement?" 
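The "bounded gradient" point from the sidebar can be checked numerically. Here is a small sketch in plain NumPy (illustration only, with hypothetical names): for cross-entropy after softmax, the gradient with respect to the logits is `output - target`, so each component stays in `[-1, 1]` no matter how extreme the logits get.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())  # subtract the max for numerical stability
    return e / e.sum()

z = np.array([50.0, -30.0, 10.0])   # deliberately extreme logits
target = np.array([0.0, 1.0, 0.0])  # one-hot target
grad = softmax(z) - target          # gradient of cross-entropy w.r.t. the logits
print(grad.min(), grad.max())       # every component lies within [-1, 1]
```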
] }, { @@ -949,7 +949,7 @@ "\n", "Most importantly, remember that if your dataset is big, there is no point prototyping on the whole thing. Find a small subset that is representative of the whole, like we did with Imagenette, and experiment on it.\n", "\n", - "In the next three chapters, we will look at the other applications directly supported by fastai: collaborative filtering, tabular and text. We will go back to computer vision in the next section of the book, with a deep dive in convolutional neural networks in <>. " + "In the next three chapters, we will look at the other applications directly supported by fastai: collaborative filtering, tabular modeling and working with text. We will go back to computer vision in the next section of the book, with a deep dive into convolutional neural networks in <>. " ] }, { @@ -972,23 +972,23 @@ "1. Is using TTA at inference slower or faster than regular inference? Why?\n", "1. What is Mixup? How do you use it in fastai?\n", "1. Why does Mixup prevent the model from being too confident?\n", - "1. Why does a training with Mixup for 5 epochs end up worse than a training without Mixup?\n", + "1. Why does training with Mixup for five epochs end up worse than training without Mixup?\n", "1. What is the idea behind label smoothing?\n", "1. What problems in your data can label smoothing help with?\n", - "1. When using label smoothing with 5 categories, what is the target associated with the index 1?\n", - "1. What is the first step to take when you want to prototype quick experiments on a new dataset." + "1. When using label smoothing with five categories, what is the target associated with the index 1?\n", + "1. What is the first step to take when you want to prototype quick experiments on a new dataset?" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research\n", + "### Further Research\n", "\n", - "1. 
Use the fastai documentation to build a function that crops an image to a square in the four corners, then implement a TTA method that averages the predictions on a center crop and those four crops. Did it help? Is it better than the TTA method of fastai?\n", - "1. Find the Mixup paper on arxiv and read it. Pick one or two more recent articles introducing variants of Mixup and read them, then try to implement them on your problem.\n", - "1. Find the script training Imagenette using Mixup and use it as an example to build a script for a long training on your own project. Execute it and see if it helped.\n", - "1. Read the sidebar on the math of label smoothing, and look at the relevant section of the original paper, and see if you can follow it. Don't be afraid to ask for help!" + "1. Use the fastai documentation to build a function that crops an image to a square in each of the four corners, then implement a TTA method that averages the predictions on a center crop and those four crops. Did it help? Is it better than the TTA method of fastai?\n", + "1. Find the Mixup paper on arXiv and read it. Pick one or two more recent articles introducing variants of Mixup and read them, then try to implement them on your problem.\n", + "1. Find the script training Imagenette using Mixup and use it as an example to build a script for a long training on your own project. Execute it and see if it helps.\n", + "1. Read the sidebar \"Label Smoothing, the Paper\", look at the relevant section of the original paper and see if you can follow it. Don't be afraid to ask for help!" 
] }, { diff --git a/08_collab.ipynb b/08_collab.ipynb index 31d4d55cf..d29166625 100644 --- a/08_collab.ipynb +++ b/08_collab.ipynb @@ -21,7 +21,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# Collaborative filtering deep dive" + "# Collaborative Filtering Deep Dive" ] }, { @@ -48,7 +48,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## A first look at the data" + "## A First Look at the Data" ] }, { @@ -318,7 +318,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Learning the latent factors" + "## Learning the Latent Factors" ] }, { @@ -816,7 +816,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Collaborative filtering from scratch" + "## Collaborative Filtering from Scratch" ] }, { @@ -1214,7 +1214,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Weight decay" + "### Weight Decay" ] }, { @@ -1272,7 +1272,7 @@ "In practice though, it would be very inefficient (and maybe numerically unstable) to compute that big sum and add it to the loss. If you remember a little bit of high schoool math, you might recall that the derivative of `p**2` with respect to `p` is `2*p`, so adding that big sum to our loss is exactly the same as doing:\n", "\n", "``` python\n", - "weight.grad += wd * 2 * weight\n", + "parameters.grad += wd * 2 * parameters\n", "```\n", "\n", "In practice, since `wd` is a parameter that we choose, we can just make it twice as big, so we don't even need the `*2` in the above equation. 
To use weight decay in fastai, just pass `wd` in your call to fit:" @@ -1354,7 +1354,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Creating our own Embedding module" + "### Creating Our Own Embedding Module" ] }, { @@ -1601,7 +1601,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Interpreting embeddings and biases" + "## Interpreting Embeddings and Biases" ] }, { @@ -1903,7 +1903,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Embedding distance" + "### Embedding Distance" ] }, { @@ -1950,7 +1950,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Boot strapping a collaborative filtering model" + "## Boot Strapping a Collaborative Filtering Model" ] }, { @@ -1986,7 +1986,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Deep learning for collaborative filtering" + "## Deep Learning for Collaborative Filtering" ] }, { @@ -2238,7 +2238,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Sidebar: kwargs and delegates" + "### Sidebar: Kwargs and Delegates" ] }, { @@ -2330,7 +2330,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research\n", + "### Further Research\n", "\n", "1. Take a look at all the differences between the `Embedding` version of `DotProductBias` and the `create_params` version, and try to understand why each of those changes is required. If you're not sure, try reverting each change, to see what happens. (NB: even the type of brackets used in `forward` has changed!)\n", "1. 
Find three other areas where collaborative filtering is being used, and find out what pros and cons of this approach in those areas.\n", diff --git a/09_tabular.ipynb b/09_tabular.ipynb index 4352039f0..994aea55f 100644 --- a/09_tabular.ipynb +++ b/09_tabular.ipynb @@ -4,7 +4,7 @@ "cell_type": "code", "execution_count": null, "metadata": { - "hide_input": true + "hide_input": false }, "outputs": [ { @@ -41,7 +41,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# Tabular modelling deep dive" + "# Tabular Modeling Deep Dive" ] }, { @@ -57,7 +57,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Categorical embeddings" + "## Categorical Embeddings" ] }, { @@ -187,7 +187,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Beyond deep learning" + "## Beyond Deep Learning" ] }, { @@ -239,7 +239,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## The dataset" + "## The Dataset" ] }, { @@ -390,7 +390,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Look at the data" + "### Look at the Data" ] }, { @@ -542,7 +542,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Decision trees" + "## Decision Trees" ] }, { @@ -591,7 +591,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Handling dates" + "### Handling Dates" ] }, { @@ -1405,7 +1405,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Creating the decision tree" + "### Creating the Decision Tree" ] }, { @@ -7418,7 +7418,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Categorical variables" + "### Categorical Variables" ] }, { @@ -7454,7 +7454,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Random forests" + "## Random Forests" ] }, { @@ -7493,7 +7493,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Creating a random forest" + "### Creating a Random Forest" ] }, { @@ -7662,7 +7662,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Out-of-bag error" + "### 
Out-of-Bag Error" ] }, { @@ -7721,7 +7721,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Model interpretation" + "## Model Interpretation" ] }, { @@ -7743,7 +7743,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Tree variance for prediction confidence" + "### Tree Variance for Prediction Confidence" ] }, { @@ -7840,7 +7840,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Feature importance" + "### Feature Importance" ] }, { @@ -8020,7 +8020,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Removing low-importance variables" + "### Removing Low-Importance Variables" ] }, { @@ -8173,7 +8173,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Removing redundant features" + "### Removing Redundant Features" ] }, { @@ -8396,14 +8396,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Tk add transition" + "Now that we know which variables influence our predictions the most, we can have a look at how they affect the results using partial dependence plots." 
] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Partial dependence" + "### Partial Dependence" ] }, { @@ -8526,7 +8526,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Data leakage" + "### Data Leakage" ] }, { @@ -8570,7 +8570,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Tree interpreter" + "### Tree Interpreter" ] }, { @@ -8716,7 +8716,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Extrapolation and neural networks" + "## Extrapolation and Neural Networks" ] }, { @@ -8730,7 +8730,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### The extrapolation problem" + "### The Extrapolation Problem" ] }, { @@ -8890,7 +8890,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Finding out of domain data" + "### Finding Out-of-Domain Data" ] }, { @@ -9139,7 +9139,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Using a neural network" + "### Using a Neural Network" ] }, { @@ -9560,7 +9560,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Sidebar: fastai's Tabular classes" + "### Sidebar: fastai's Tabular Classes" ] }, { @@ -9700,7 +9700,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Combining embeddings with other methods" + "### Combining Embeddings with Other Methods" ] }, { @@ -9732,7 +9732,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Conclusion: our advice for tabular modeling" + "## Conclusion: Our Advice for Tabular Modeling" ] }, { @@ -9802,7 +9802,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research" + "### Further Research" ] }, { diff --git a/10_nlp.ipynb b/10_nlp.ipynb index 3a418dd75..83d289880 100644 --- a/10_nlp.ipynb +++ b/10_nlp.ipynb @@ -22,7 +22,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# NLP deep dive: RNNs" + "# NLP Deep Dive: RNNs" ] }, { @@ -74,7 +74,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Text preprocessing" + "## Text 
Preprocessing" ] }, { @@ -142,7 +142,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Word tokenization with fastai" + "### Word Tokenization with fastai" ] }, { @@ -395,7 +395,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Subword tokenization" + "### Subword Tokenization" ] }, { @@ -732,7 +732,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Putting our texts into batches for a language model" + "### Putting Our Texts Into Batches for a Language Model" ] }, { @@ -1271,7 +1271,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Training a text classifier" + "## Training a Text Classifier" ] }, { @@ -1287,7 +1287,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Language model using DataBlock" + "### Language Model Using DataBlock" ] }, { @@ -1380,7 +1380,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Fine tuning the language model" + "### Fine Tuning the Language Model" ] }, { @@ -1478,7 +1478,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Saving and loading models" + "### Saving and Loading Models" ] }, { @@ -1670,7 +1670,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Text generation" + "### Text Generation" ] }, { @@ -1745,7 +1745,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Creating the classifier DataLoaders" + "### Creating the Classifier DataLoaders" ] }, { @@ -1918,7 +1918,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Fine tuning the classifier" + "### Fine Tuning the Classifier" ] }, { @@ -2136,7 +2136,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Disinformation and language models" + "## Disinformation and Language Models" ] }, { @@ -2249,7 +2249,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research" + "### Further Research" ] }, { diff --git a/11_midlevel_data.ipynb b/11_midlevel_data.ipynb index 6c6c4fdcd..731602dfc 100644 --- 
a/11_midlevel_data.ipynb +++ b/11_midlevel_data.ipynb @@ -22,7 +22,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# Data munging with fastai's mid-level API" + "# Data Munging with fastai's Mid-Level API" ] }, { @@ -36,7 +36,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Going deeper into fastai's layered API" + "## Going Deeper into fastai's Layered API" ] }, { @@ -273,7 +273,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Writing your own Transform" + "### Writing Your Own Transform" ] }, { @@ -480,7 +480,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## TfmdLists and Datasets: Transformed collections" + "## TfmdLists and Datasets: Transformed Collections" ] }, { @@ -909,7 +909,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Applying the mid-tier data API: SiamesePair" + "## Applying the Mid-Tier Data API: SiamesePair" ] }, { @@ -1246,7 +1246,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research" + "### Further Research" ] }, { @@ -1261,7 +1261,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Becoming a deep learning practitioner" + "## Becoming a Deep Learning Practitioner" ] }, { diff --git a/12_nlp_dive.ipynb b/12_nlp_dive.ipynb index bfb8f14a8..93c810e2b 100644 --- a/12_nlp_dive.ipynb +++ b/12_nlp_dive.ipynb @@ -21,7 +21,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# A language model from scratch" + "# A Language Model from Scratch" ] }, { @@ -37,7 +37,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## The data" + "## The Data" ] }, { @@ -255,7 +255,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Our first language model from scratch" + "## Our First Language Model from Scratch" ] }, { @@ -350,7 +350,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Our language model in PyTorch" + "### Our Language Model in PyTorch" ] }, { @@ -559,7 +559,7 @@ "cell_type": "markdown", "metadata": 
{}, "source": [ - "### Our first recurrent neural network" + "### Our First Recurrent Neural Network" ] }, { @@ -726,7 +726,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Maintaining the state of an RNN" + "### Maintaining the State of an RNN" ] }, { @@ -990,7 +990,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Creating more signal" + "### Creating More Signal" ] }, { @@ -1309,7 +1309,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## The model" + "## The Model" ] }, { @@ -1493,7 +1493,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Exploding or disappearing activations" + "### Exploding or Disappearing Activations" ] }, { @@ -1552,7 +1552,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Building an LSTM from scratch" + "### Building an LSTM from Scratch" ] }, { @@ -1702,7 +1702,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Training a language model using LSTMs" + "### Training a Language Model Using LSTMs" ] }, { @@ -1977,7 +1977,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### AR and TAR regularization" + "### AR and TAR Regularization" ] }, { @@ -2010,7 +2010,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Training a weight-tied regularized LSTM" + "### Training a Weight-Tied Regularized LSTM" ] }, { @@ -2318,7 +2318,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research" + "### Further Research" ] }, { diff --git a/13_convolutions.ipynb b/13_convolutions.ipynb index 0504cc580..898b3eb3b 100644 --- a/13_convolutions.ipynb +++ b/13_convolutions.ipynb @@ -24,7 +24,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# Convolutional neural networks" + "# Convolutional Neural Networks" ] }, { @@ -40,7 +40,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## The magic of convolutions" + "## The Magic of Convolutions" ] }, { @@ -1378,7 +1378,7 @@ "cell_type": "markdown", "metadata": {}, 
"source": [ - "### Mapping a convolution kernel" + "### Mapping a Convolution Kernel" ] }, { @@ -1743,7 +1743,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Strides and padding" + "### Strides and Padding" ] }, { @@ -1808,7 +1808,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Understanding the convolution equations" + "### Understanding the Convolution Equations" ] }, { @@ -1929,7 +1929,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Our first convolutional neural network" + "## Our First Convolutional Neural Network" ] }, { @@ -2292,7 +2292,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Understanding convolution arithmetic" + "### Understanding Convolution Arithmetic" ] }, { @@ -2402,7 +2402,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Receptive fields" + "### Receptive Fields" ] }, { @@ -2453,7 +2453,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### A note about Twitter" + "### A Note about Twitter" ] }, { @@ -2536,7 +2536,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Colour images" + "## Colour Images" ] }, { @@ -2687,7 +2687,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Improving training stability" + "## Improving Training Stability" ] }, { @@ -2801,7 +2801,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### A simple baseline" + "### A Simple Baseline" ] }, { @@ -3003,7 +3003,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Increase batch size" + "### Increase Batch Size" ] }, { @@ -3103,7 +3103,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### 1cycle training" + "### 1cycle Training" ] }, { @@ -3362,7 +3362,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Batch normalization" + "### Batch Normalization" ] }, { @@ -3723,7 +3723,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research" + "### Further Research" ] }, { diff --git 
a/14_resnet.ipynb b/14_resnet.ipynb index b97dc40c4..1b035d942 100644 --- a/14_resnet.ipynb +++ b/14_resnet.ipynb @@ -39,7 +39,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Going back to Imagenette" + "## Going Back to Imagenette" ] }, { @@ -319,7 +319,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Building a modern CNN: ResNet" + "## Building a Modern CNN: ResNet" ] }, { @@ -333,7 +333,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Skip-connections" + "### Skip-Connections" ] }, { @@ -689,7 +689,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### A state-of-the-art ResNet" + "### A State-of-the-Art ResNet" ] }, { @@ -935,7 +935,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Bottleneck layers" + "### Bottleneck Layers" ] }, { @@ -1244,7 +1244,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research" + "### Further Research" ] }, { diff --git a/15_arch_details.ipynb b/15_arch_details.ipynb index aa292893f..00518cd9a 100644 --- a/15_arch_details.ipynb +++ b/15_arch_details.ipynb @@ -21,7 +21,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# Application architectures deep dive" + "# Application Architectures Deep Dive" ] }, { @@ -39,7 +39,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Computer vision" + "## Computer Vision" ] }, { @@ -242,7 +242,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### A Siamese network" + "### A Siamese Network" ] }, { @@ -579,7 +579,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Natural language processing" + "## Natural Language Processing" ] }, { @@ -707,7 +707,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Wrapping up architectures" + "## Wrapping up Architectures" ] }, { @@ -778,7 +778,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research" + "### Further Research" ] }, { diff --git a/16_accel_sgd.ipynb b/16_accel_sgd.ipynb 
index 8ae55c6c7..296025dbc 100644 --- a/16_accel_sgd.ipynb +++ b/16_accel_sgd.ipynb @@ -23,7 +23,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# The training process" + "# The Training Process" ] }, { @@ -47,7 +47,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Let's start with SGD" + "## Let's Start with SGD" ] }, { @@ -305,7 +305,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## A generic optimizer" + "## A Generic Optimizer" ] }, { @@ -871,7 +871,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Decoupled weight_decay" + "## Decoupled Weight Decay" ] }, { @@ -1010,7 +1010,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Creating a callback" + "### Creating a Callback" ] }, { @@ -1146,7 +1146,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Callback ordering and exceptions" + "### Callback Ordering and Exceptions" ] }, { @@ -1269,7 +1269,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research" + "### Further Research" ] }, { diff --git a/17_foundations.ipynb b/17_foundations.ipynb index 94c49cb0e..4d6717b4e 100644 --- a/17_foundations.ipynb +++ b/17_foundations.ipynb @@ -23,7 +23,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# A neural net from the foundations" + "# A Neural Net from the Foundations" ] }, { @@ -39,7 +39,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## A neural net layer from scratch" + "## A Neural Net Layer from Scratch" ] }, { @@ -53,7 +53,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Modeling a neuron" + "### Modeling a Neuron" ] }, { @@ -117,7 +117,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Matrix multiplication from scratch" + "### Matrix Multiplication from Scratch" ] }, { @@ -242,8 +242,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "\n", - "### Elementwise arithmetic" + "### Elementwise Arithmetic" ] }, { @@ -1091,7 +1090,7 @@ "cell_type": 
"markdown", "metadata": {}, "source": [ - "### Einstein summation" + "### Einstein Summation" ] }, { @@ -1182,7 +1181,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## The forward and backward passes" + "## The Forward and Backward Passes" ] }, { @@ -1196,7 +1195,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Defining and initializing a layer" + "### Defining and Initializing a Layer" ] }, { @@ -1766,7 +1765,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Gradients and backward pass" + "### Gradients and Backward Pass" ] }, { @@ -1971,7 +1970,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Refactor the model" + "### Refactor the Model" ] }, { @@ -2421,7 +2420,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research" + "### Further Research" ] }, { diff --git a/18_CAM.ipynb b/18_CAM.ipynb index 3cc27a4d1..4aeca1069 100644 --- a/18_CAM.ipynb +++ b/18_CAM.ipynb @@ -23,7 +23,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# CNN interpretation with CAM" + "# CNN Interpretation with CAM" ] }, { @@ -39,7 +39,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## CAM and hooks" + "## CAM and Hooks" ] }, { @@ -633,7 +633,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research" + "### Further Research" ] }, { diff --git a/19_learner.ipynb b/19_learner.ipynb index a815e55b8..c0bd225bf 100644 --- a/19_learner.ipynb +++ b/19_learner.ipynb @@ -14,7 +14,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# fastai Learner from scratch" + "# fastai Learner from Scratch" ] }, { @@ -1548,7 +1548,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Scheduling the learning rate" + "### Scheduling the Learning Rate" ] }, { @@ -1861,7 +1861,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research" + "### Further Research" ] }, { diff --git a/20_conclusion.ipynb b/20_conclusion.ipynb index 
d7b5ef19a..fb5a39d58 100644 --- a/20_conclusion.ipynb +++ b/20_conclusion.ipynb @@ -11,7 +11,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# Concluding thoughts" + "# Concluding Thoughts" ] }, { diff --git a/app_blog.ipynb b/app_blog.ipynb index 840eef5e8..2d8b9db71 100644 --- a/app_blog.ipynb +++ b/app_blog.ipynb @@ -4,6 +4,7 @@ "cell_type": "raw", "metadata": {}, "source": [ + "[[appendix_blog]]\n", "[appendix]\n", "[role=\"Creating a blog\"]" ] @@ -23,7 +24,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# Creating a blog" + "# Creating a Blog" ] }, { @@ -48,7 +49,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Creating the repository" + "### Creating the Repository" ] }, { @@ -72,7 +73,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Setting up your homepage" + "### Setting up Your Homepage" ] }, { @@ -136,7 +137,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Creating posts" + "### Creating Posts" ] }, { @@ -226,7 +227,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Synchronizing GitHub and your computer" + "### Synchronizing GitHub and Your Computer" ] }, { @@ -262,7 +263,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Jupyter for blogging" + "### Jupyter for Blogging" ] }, { diff --git a/clean/01_intro.ipynb b/clean/01_intro.ipynb index e346f1d77..0c52130cf 100644 --- a/clean/01_intro.ipynb +++ b/clean/01_intro.ipynb @@ -14,70 +14,70 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# Your deep learning journey" + "# Your Deep Learning Journey" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Deep learning is for everyone" + "## Deep Learning Is for Everyone" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Neural networks: a brief history" + "## Neural Networks: A Brief History" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Who we are" + "## Who We Are" ] }, { "cell_type": "markdown", 
"metadata": {}, "source": [ - "## How to learn deep learning" + "## How to Learn Deep Learning" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Your projects and your mindset" + "### Your Projects and Your Mindset" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## The software: PyTorch, fastai, and Jupyter" + "## The Software: PyTorch, fastai, and Jupyter" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Your first model" + "## Your First Model" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Getting a GPU deep learning server" + "### Getting a GPU Deep Learning Server" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Running your first notebook" + "### Running Your First Notebook" ] }, { @@ -166,7 +166,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Sidebar: This book was written in Jupyter Notebooks" + "### Sidebar: This Book Was Written in Jupyter Notebooks" ] }, { @@ -291,7 +291,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### What is machine learning?" + "### What Is Machine Learning?" ] }, { @@ -627,14 +627,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### What is a neural network?" + "### What Is a Neural Network?" 
] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "#### A bit of deep learning jargon" + "### A Bit of Deep Learning Jargon" ] }, { @@ -757,53 +757,53 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Limitations inherent to machine learning\n", + "### Limitations Inherent To Machine Learning\n", "\n", "From this picture we can now see some fundamental things about training a deep learning model:\n", "\n", - "- A model cannot be created without data ;\n", - "- A model can only learn to operate on the patterns seen in the input data used to train it ;\n", - "- This learning approach only creates *predictions*, not recommended *actions* ;\n", - "- It's not enough to just have examples of input data; we need *labels* for that data too (e.g. pictures of dogs and cats aren't enough to train a model; we need a label for each one, saying which ones are dogs, and which are cats).\n", + "- A model cannot be created without data.\n", + "- A model can only learn to operate on the patterns seen in the input data used to train it.\n", + "- This learning approach only creates *predictions*, not recommended *actions*.\n", + "- It's not enough to just have examples of input data; we need *labels* for that data too (e.g., pictures of dogs and cats aren't enough to train a model; we need a label for each one, saying which ones are dogs, and which are cats).\n", "\n", - "Generally speaking, we've seen that most organizations that think they don't have enough data, actually mean they don't have enough *labeled* data. If any organization is interested in doing something in practice with a model, then presumably they have some inputs they plan to run their model against. And presumably they've been doing that some other way for a while (e.g. manually, or with some heuristic program), so they have data from those processes! 
For instance, a radiology practice will almost certainly have an archive of medical scans (since they need to be able to check how their patients are progressing over time), but those scans may not have structured labels containing a list of diagnoses or interventions (since radiologists generally create free text natural language reports, not structured data). We'll be discussing labeling approaches a lot in this book, since it's such an important issue in practice.\n", + "Generally speaking, we've seen that most organizations that say they don't have enough data, actually mean they don't have enough *labeled* data. If any organization is interested in doing something in practice with a model, then presumably they have some inputs they plan to run their model against. And presumably they've been doing that some other way for a while (e.g., manually, or with some heuristic program), so they have data from those processes! For instance, a radiology practice will almost certainly have an archive of medical scans (since they need to be able to check how their patients are progressing over time), but those scans may not have structured labels containing a list of diagnoses or interventions (since radiologists generally create free-text natural language reports, not structured data). We'll be discussing labeling approaches a lot in this book, because it's such an important issue in practice.\n", "\n", - "Since these kinds of machine learning models can only make *predictions* (i.e. attempt to replicate labels), this can result in a significant gap between organizational goals and model capabilities. For instance, in this book you'll learn how to create a *recommendation system* that can predict what products a user might purchase. This is often used in e-commerce, such as to customize products shown on a home page, by showing the highest-ranked items. 
But such a model is generally created by looking at a user and their buying history (*inputs*) and what they went on to buy or look at (*labels*), which means that the model is likely to tell you about products they already have, or already know about, rather than new products that they are most likely to be interested in hearing about. That's very different to what, say, an expert at your local bookseller might do, where they ask questions to figure out your taste, and then tell you about authors or series that you've never heard of before." + "Since these kinds of machine learning models can only make *predictions* (i.e., attempt to replicate labels), this can result in a significant gap between organizational goals and model capabilities. For instance, in this book you'll learn how to create a *recommendation system* that can predict what products a user might purchase. This is often used in e-commerce, such as to customize products shown on a home page by showing the highest-ranked items. But such a model is generally created by looking at a user and their buying history (*inputs*) and what they went on to buy or look at (*labels*), which means that the model is likely to tell you about products the user already has or already knows about, rather than new products that they are most likely to be interested in hearing about. That's very different to what, say, an expert at your local bookseller might do, where they ask questions to figure out your taste, and then tell you about authors or series that you've never heard of before." 
] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### How our image recognizer works" + "### How Our Image Recognizer Works" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### What our image recognizer learned" + "### What Our Image Recognizer Learned" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Image recognizers can tackle non-image tasks" + "### Image Recognizers Can Tackle Non-Image Tasks" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Jargon recap" + "### Jargon Recap" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Deep learning is not just for image classification" + "## Deep Learning Is Not Just for Image Classification" ] }, { @@ -1114,7 +1114,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Sidebar: The order matters" + "### Sidebar: The Order Matters" ] }, { @@ -1441,7 +1441,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Sidebar: Datasets: food for models" + "### Sidebar: Datasets: Food for Models" ] }, { @@ -1455,14 +1455,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Validation sets and test sets" + "## Validation Sets and Test Sets" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Use judgment in defining test sets" + "### Use Judgment in Defining Test Sets" ] }, { @@ -1483,7 +1483,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "It can be hard to know in pages and pages of prose what are the key things you really need to focus on and remember. So we've prepared a list of questions and suggested steps to complete at the end of each chapter. All the answers are in the text of the chapter, so if you're not sure about anything here, re-read that part of the text and make sure you understand it. Answers to all these questions are also available on the [book website](https://book.fast.ai). 
You can also visit [the forums](https://forums.fast.ai) if you get stuck to get help from other folks studying this material." + "It can be hard to know in pages and pages of prose what the key things are that you really need to focus on and remember. So, we've prepared a list of questions and suggested steps to complete at the end of each chapter. All the answers are in the text of the chapter, so if you're not sure about anything here, reread that part of the text and make sure you understand it. Answers to all these questions are also available on the [book's website](https://book.fast.ai). You can also visit [the forums](https://forums.fast.ai) if you get stuck to get help from other folks studying this material." ] }, { @@ -1491,33 +1491,35 @@ "metadata": {}, "source": [ "1. Do you need these for deep learning?\n", + "\n", " - Lots of math T / F\n", " - Lots of data T / F\n", " - Lots of expensive computers T / F\n", " - A PhD T / F\n", + " \n", "1. Name five areas where deep learning is now the best in the world.\n", "1. What was the name of the first device that was based on the principle of the artificial neuron?\n", - "1. Based on the book of the same name, what are the requirements for \"Parallel Distributed Processing\"?\n", + "1. Based on the book of the same name, what are the requirements for parallel distributed processing (PDP)?\n", "1. What were the two theoretical misunderstandings that held back the field of neural networks?\n", "1. What is a GPU?\n", "1. Open a notebook and execute a cell containing: `1+1`. What happens?\n", "1. Follow through each cell of the stripped version of the notebook for this chapter. Before executing each cell, guess what will happen.\n", "1. Complete the Jupyter Notebook online appendix.\n", "1. Why is it hard to use a traditional computer program to recognize images in a photo?\n", - "1. What did Samuel mean by \"Weight Assignment\"?\n", - "1. 
What term do we normally use in deep learning for what Samuel called \"Weights\"?\n", - "1. Draw a picture that summarizes Arthur Samuel's view of a machine learning model\n", + "1. What did Samuel mean by \"weight assignment\"?\n", + "1. What term do we normally use in deep learning for what Samuel called \"weights\"?\n", + "1. Draw a picture that summarizes Samuel's view of a machine learning model.\n", "1. Why is it hard to understand why a deep learning model makes a particular prediction?\n", - "1. What is the name of the theorem that a neural network can solve any mathematical problem to any level of accuracy?\n", + "1. What is the name of the theorem that shows that a neural network can solve any mathematical problem to any level of accuracy?\n", "1. What do you need in order to train a model?\n", "1. How could a feedback loop impact the rollout of a predictive policing model?\n", - "1. Do we always have to use 224x224 pixel images with the cat recognition model?\n", + "1. Do we always have to use 224\\*224-pixel images with the cat recognition model?\n", "1. What is the difference between classification and regression?\n", "1. What is a validation set? What is a test set? Why do we need them?\n", "1. What will fastai do if you don't provide a validation set?\n", "1. Can we always use a random sample for a validation set? Why or why not?\n", "1. What is overfitting? Provide an example.\n", - "1. What is a metric? How does it differ to \"loss\"?\n", + "1. What is a metric? How does it differ from \"loss\"?\n", "1. How can pretrained models help?\n", "1. What is the \"head\" of a model?\n", "1. What kinds of features do the early layers of a CNN find? 
How about the later layers?\n", @@ -1533,14 +1535,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research" + "### Further Research" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Each chapter also has a \"further research\" with questions that aren't fully answered in the text, or include more advanced assignments. Answers to these questions aren't on the book website--you'll need to do your own research!" + "Each chapter also has a \"Further Research\" section that poses questions that aren't fully answered in the text, or gives more advanced assignments. Answers to these questions aren't on the book's website; you'll need to do your own research!" ] }, { @@ -1548,8 +1550,15 @@ "metadata": {}, "source": [ "1. Why is a GPU useful for deep learning? How is a CPU different, and why is it less effective for deep learning?\n", - "1. Try to think of three areas where feedback loops might impact use of machine learning. See if you can find documented examples of that happening in practice." + "1. Try to think of three areas where feedback loops might impact the use of machine learning. See if you can find documented examples of that happening in practice." 
] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] } ], "metadata": { diff --git a/clean/02_production.ipynb b/clean/02_production.ipynb index c0afe32c9..ee7b0342b 100644 --- a/clean/02_production.ipynb +++ b/clean/02_production.ipynb @@ -15,28 +15,28 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# From model to production" + "# From Model to Production" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## The practice of deep learning" + "## The Practice of Deep Learning" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Starting your project" + "### Starting Your Project" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### The state of deep learning" + "### The State of Deep Learning" ] }, { @@ -78,21 +78,28 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### The Drivetrain approach" + "#### Other data types" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Gathering data" + "### The Drivetrain Approach" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "To download images with Bing Image Search, you should sign up at Microsoft for *Bing Image Search*. You will be given a key, which you can either paste here, replacing \"XXX\":" + "## Gathering Data" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To download images with Bing Image Search, sign up at Microsoft for a free account. 
You will be given a key, which you can copy and enter in a cell as follows (replacing 'XXX' with your key and executing it):" ] }, { @@ -280,7 +287,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Sidebar: Getting help in Jupyter notebooks" + "### Sidebar: Getting Help in Jupyter Notebooks" ] }, { @@ -294,7 +301,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## From data to DataLoaders" + "## From Data to DataLoaders" ] }, { @@ -306,7 +313,7 @@ "bears = DataBlock(\n", " blocks=(ImageBlock, CategoryBlock), \n", " get_items=get_image_files, \n", - " splitter=RandomSplitter(valid_pct=0.3, seed=42),\n", + " splitter=RandomSplitter(valid_pct=0.2, seed=42),\n", " get_y=parent_label,\n", " item_tfms=Resize(128))" ] @@ -418,7 +425,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Data augmentation" + "### Data Augmentation" ] }, { @@ -449,7 +456,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Training your model, and using it to clean your data" + "## Training Your Model, and Using It to Clean Your Data" ] }, { @@ -673,14 +680,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Turning your model into an online application" + "## Turning Your Model into an Online Application" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Using the model for inference" + "### Using the Model for Inference" ] }, { @@ -776,7 +783,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Creating a Notebook app from the model" + "### Creating a Notebook App from the Model" ] }, { @@ -965,7 +972,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Turning your notebook into a real app" + "### Turning Your Notebook into a Real App" ] }, { @@ -990,21 +997,21 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## How to avoid disaster" + "## How to Avoid Disaster" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Unforeseen consequences and feedback loops" + 
"### Unforeseen Consequences and Feedback Loops" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Get writing!" + "## Get Writing!" ] }, { @@ -1018,21 +1025,21 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "1. Provide an example of where the bear classification model might work poorly, due to structural or style differences to the training data.\n", + "1. Provide an example of where the bear classification model might work poorly in production, due to structural or style differences in the training data.\n", "1. Where do text models currently have a major deficiency?\n", "1. What are possible negative societal implications of text generation models?\n", "1. In situations where a model might make mistakes, and those mistakes could be harmful, what is a good alternative to automating a process?\n", "1. What kind of tabular data is deep learning particularly good at?\n", "1. What's a key downside of directly using a deep learning model for recommendation systems?\n", - "1. What are the steps of the Drivetrain approach?\n", - "1. How do the steps of the Drivetrain approach map to a recommendation system?\n", + "1. What are the steps of the Drivetrain Approach?\n", + "1. How do the steps of the Drivetrain Approach map to a recommendation system?\n", "1. Create an image recognition model using data you curate, and deploy it on the web.\n", "1. What is `DataLoaders`?\n", "1. What four things do we need to tell fastai to create `DataLoaders`?\n", "1. What does the `splitter` parameter to `DataBlock` do?\n", "1. How do we ensure a random split always gives the same validation set?\n", "1. What letters are often used to signify the independent and dependent variables?\n", - "1. What's the difference between crop, pad, and squish resize approaches? When might you choose one over the other?\n", + "1. What's the difference between the crop, pad, and squish resize approaches? When might you choose one over the others?\n", "1. 
What is data augmentation? Why is it needed?\n", "1. What is the difference between `item_tfms` and `batch_tfms`?\n", "1. What is a confusion matrix?\n", @@ -1041,27 +1048,27 @@ "1. What are IPython widgets?\n", "1. When might you want to use CPU for deployment? When might GPU be better?\n", "1. What are the downsides of deploying your app to a server, instead of to a client (or edge) device such as a phone or PC?\n", - "1. What are 3 examples of problems that could occur when rolling out a bear warning system in practice?\n", - "1. What is \"out of domain data\"?\n", + "1. What are three examples of problems that could occur when rolling out a bear warning system in practice?\n", + "1. What is \"out-of-domain data\"?\n", "1. What is \"domain shift\"?\n", - "1. What are the 3 steps in the deployment process?\n", - "1. For a project you're interested in applying deep learning to, consider the thought experiment \"what would happen if it went really, really well?\"\n", - "1. Start a blog, and write your first blog post. For instance, write about what you think deep learning might be useful for in a domain you're interested in." + "1. What are the three steps in the deployment process?" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research" + "### Further Research" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "1. Consider how the Drivetrain approach maps to a project or problem you're interested in.\n", - "1. When might it be best to avoid certain types of data augmentation?" + "1. Consider how the Drivetrain Approach maps to a project or problem you're interested in.\n", + "1. When might it be best to avoid certain types of data augmentation?\n", + "1. For a project you're interested in applying deep learning to, consider the thought experiment \"What would happen if it went really, really well?\"\n", + "1. Start a blog, and write your first blog post. 
For instance, write about what you think deep learning might be useful for in a domain you're interested in." ] }, { diff --git a/clean/03_ethics.ipynb b/clean/03_ethics.ipynb index 92eb354f2..27ecafdc8 100644 --- a/clean/03_ethics.ipynb +++ b/clean/03_ethics.ipynb @@ -11,7 +11,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Sidebar: Acknowledgement: Dr Rachel Thomas" + "### Sidebar: Acknowledgement: Dr. Rachel Thomas" ] }, { @@ -25,42 +25,42 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Key examples for data ethics" + "## Key Examples for Data Ethics" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Bugs and recourse: Buggy algorithm used for healthcare benefits" + "### Bugs and Recourse: Buggy Algorithm Used for Healthcare Benefits" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Feedback loops: YouTube's recommendation system" + "### Feedback Loops: YouTube's Recommendation System" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Bias: Professor Lantanya Sweeney \"arrested\"" + "### Bias: Professor Latanya Sweeney \"Arrested\"" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Why does this matter?" + "### Why Does This Matter?"
] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Integrating machine learning with product design" + "## Integrating Machine Learning with Product Design" ] }, { @@ -74,14 +74,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Recourse and accountability" + "### Recourse and Accountability" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Feedback loops" + "### Feedback Loops" ] }, { @@ -109,77 +109,70 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "#### Aggregation Bias" + "#### Aggregation bias" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "#### Representation Bias" + "#### Representation bias" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Addressing different types of bias" + "### Addressing different types of bias" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Humans are biased, so does algorithmic bias matter?" + "### Disinformation" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Disinformation" + "## Identifying and Addressing Ethical Issues" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Identifying and addressing ethical issues" + "### Analyze a Project You Are Working On" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Analyze a project you are working on" + "### Processes to Implement" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Processes to implement" + "#### Ethical lenses" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "#### Ethical Lenses" + "### The Power of Diversity" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### The power of diversity" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### Fairness, accountability, and transparency" + "### Fairness, Accountability, and Transparency" ] }, { @@ -193,21 +186,21 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### The effectiveness of 
regulation" + "### The Effectiveness of Regulation" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Rights and policy" + "### Rights and Policy" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Cars: a historical precedent" + "### Cars: A Historical Precedent" ] }, { @@ -230,16 +223,16 @@ "source": [ "1. Does ethics provide a list of \"right answers\"?\n", "1. How can working with people of different backgrounds help when considering ethical questions?\n", - "1. What was the role of IBM in Nazi Germany? Why did the company participate as they did? Why did the workers participate?\n", - "1. What was the role of the first person jailed in the VW diesel scandal?\n", + "1. What was the role of IBM in Nazi Germany? Why did the company participate as it did? Why did the workers participate?\n", + "1. What was the role of the first person jailed in the Volkswagen diesel scandal?\n", "1. What was the problem with a database of suspected gang members maintained by California law enforcement officials?\n", - "1. Why did YouTube's recommendation algorithm recommend videos of partially clothed children to pedophiles, even though no employee at Google programmed this feature?\n", + "1. Why did YouTube's recommendation algorithm recommend videos of partially clothed children to pedophiles, even though no employee at Google had programmed this feature?\n", "1. What are the problems with the centrality of metrics?\n", - "1. Why did Meetup.com not include gender in their recommendation system for tech meetups?\n", + "1. Why did Meetup.com not include gender in its recommendation system for tech meetups?\n", "1. What are the six types of bias in machine learning, according to Suresh and Guttag?\n", "1. Give two examples of historical race bias in the US.\n", - "1. Where are most images in Imagenet from?\n", - "1. In the paper \"Does Machine Learning Automate Moral Hazard and Error\" why is sinusitis found to be predictive of a stroke?\n", + "1. 
Where are most images in ImageNet from?\n", + "1. In the paper [\"Does Machine Learning Automate Moral Hazard and Error\"](https://scholar.harvard.edu/files/sendhil/files/aer.p20171084.pdf) why is sinusitis found to be predictive of a stroke?\n", "1. What is representation bias?\n", "1. How are machines and people different, in terms of their use for making decisions?\n", "1. Is disinformation the same as \"fake news\"?\n", @@ -252,7 +245,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research:" + "### Further Research:" ] }, { @@ -260,12 +253,12 @@ "metadata": {}, "source": [ "1. Read the article \"What Happens When an Algorithm Cuts Your Healthcare\". How could problems like this be avoided in the future?\n", - "1. Research to find out more about YouTube's recommendation system and its societal impacts. Do you think recommendation systems must always have feedback loops with negative results? What approaches could Google take? What about the government?\n", - "1. Read the paper \"Discrimination in Online Ad Delivery\". Do you think Google should be considered responsible for what happened to Dr Sweeney? What would be an appropriate response?\n", + "1. Research to find out more about YouTube's recommendation system and its societal impacts. Do you think recommendation systems must always have feedback loops with negative results? What approaches could Google take to avoid them? What about the government?\n", + "1. Read the paper [\"Discrimination in Online Ad Delivery\"](https://arxiv.org/abs/1301.6822). Do you think Google should be considered responsible for what happened to Dr. Sweeney? What would be an appropriate response?\n", "1. How can a cross-disciplinary team help avoid negative consequences?\n", - "1. Read the paper \"Does Machine Learning Automate Moral Hazard and Error\" in American Economic Review. What actions do you think should be taken to deal with the issues identified in this paper?\n", + "1. 
Read the paper \"Does Machine Learning Automate Moral Hazard and Error\". What actions do you think should be taken to deal with the issues identified in this paper?\n", "1. Read the article \"How Will We Prevent AI-Based Forgery?\" Do you think Etzioni's proposed approach could work? Why?\n", - "1. Complete the section \"Analyze a project you are working on\" in this chapter.\n", + "1. Complete the section \"Analyze a Project You Are Working On\" in this chapter.\n", "1. Consider whether your team could be more diverse. If so, what approaches might help?" ] }, @@ -273,26 +266,26 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Section 1: that's a wrap!" + "## Section 1: That's a Wrap!" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Congratulations! You've made it to the end of the first section of the book. In this section we've tried to show you what deep learning can do, and how you can use it to create real applications and products. At this point, you will get a lot more out of the book if you spend some time trying out what you've learnt. Perhaps you have already been doing this as you go along — in which case, great! But if not, that's no problem either… Now is a great time to start experimenting yourself.\n", + "Congratulations! You've made it to the end of the first section of the book. In this section we've tried to show you what deep learning can do, and how you can use it to create real applications and products. At this point, you will get a lot more out of the book if you spend some time trying out what you've learned. Perhaps you have already been doing this as you go along—in which case, great! If not, that's no problem either... Now is a great time to start experimenting yourself.\n", "\n", - "If you haven't been to the book website yet, head over there now. Remember, you can find it here: [book.fast.ai](https://book.fast.ai). It's really important that you have got yourself set up to run the notebooks. 
Becoming an effective deep learning practitioner is all about practice. So you need to be training models. So please go get the notebooks running now if you haven't already! And also have a look on the website for any important updates or notices; deep learning changes fast, and we can't change the words that are printed in this book, so the website is where you need to look to ensure you have the most up-to-date information.\n", + "If you haven't been to the [book's website](https://book.fast.ai) yet, head over there now. It's really important that you get yourself set up to run the notebooks. Becoming an effective deep learning practitioner is all about practice, so you need to be training models. So, please go get the notebooks running now if you haven't already! And also have a look on the website for any important updates or notices; deep learning changes fast, and we can't change the words that are printed in this book, so the website is where you need to look to ensure you have the most up-to-date information.\n", "\n", "Make sure that you have completed the following steps:\n", "\n", - "- Connected to one of the GPU Jupyter servers recommended on the book website\n", - "- Run the first notebook yourself\n", - "- Uploaded an image that you find in the first notebook; then try a few different images of different kinds to see what happens\n", - "- Run the second notebook, collecting your own dataset based on image search queries that you come up with\n", - "- Thought about how you can use deep learning to help you with your own projects, including what kinds of data you could use, what kinds of problems may come up, and how you might be able to mitigate these issues in practice.\n", + "- Connect to one of the GPU Jupyter servers recommended on the book's website.\n", + "- Run the first notebook yourself.\n", + "- Upload an image that you find in the first notebook; then try a few different images of different kinds to see what happens.\n", + "- Run the second 
notebook, collecting your own dataset based on image search queries that you come up with.\n", + "- Think about how you can use deep learning to help you with your own projects, including what kinds of data you could use, what kinds of problems may come up, and how you might be able to mitigate these issues in practice.\n", "\n", - "In the next section of the book we will learn about how and why deep learning works, instead of just seeing how we can use it in practice. Understanding the how and why is important for both practitioners and researchers, because in this fairly new field nearly every project requires some level of customisation and debugging. The better you understand the foundations of deep learning, the better your models will be. These foundations are less important for executives, product managers, and so forth (although still useful, so feel free to keep reading!), but they are critical for anybody who is actually training and deploying models themselves." + "In the next section of the book you will learn about how and why deep learning works, instead of just seeing how you can use it in practice. Understanding the how and why is important for both practitioners and researchers, because in this fairly new field nearly every project requires some level of customization and debugging. The better you understand the foundations of deep learning, the better your models will be. These foundations are less important for executives, product managers, and so forth (although still useful, so feel free to keep reading!), but they are critical for anybody who is actually training and deploying models themselves." 
] }, { diff --git a/clean/04_mnist_basics.ipynb b/clean/04_mnist_basics.ipynb index bd0339236..d1d7f5d69 100644 --- a/clean/04_mnist_basics.ipynb +++ b/clean/04_mnist_basics.ipynb @@ -17,21 +17,21 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# Under the hood: training a digit classifier" + "# Under the Hood: Training a Digit Classifier" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Pixels: the foundations of computer vision" + "## Pixels: The Foundations of Computer Vision" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Sidebar: Tenacity and deep learning" + "## Sidebar: Tenacity and Deep Learning" ] }, { @@ -1249,7 +1249,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## First try: pixel similarity" + "## First Try: Pixel Similarity" ] }, { @@ -1495,7 +1495,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### NumPy arrays and PyTorch tensors" + "### NumPy Arrays and PyTorch Tensors" ] }, { @@ -1677,7 +1677,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Computing metrics using broadcasting" + "## Computing Metrics Using Broadcasting" ] }, { @@ -2039,7 +2039,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### The gradient" + "### Calculating Gradients" ] }, { @@ -2170,14 +2170,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Stepping with a learning rate" + "### Stepping With a Learning Rate" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### An end-to-end SGD example" + "### An End-to-End SGD Example" ] }, { @@ -2243,6 +2243,13 @@ "def mse(preds, targets): return ((preds-targets)**2).mean()" ] }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Step 1: Initialize the parameters" + ] + }, { "cell_type": "code", "execution_count": null, @@ -2262,6 +2269,13 @@ "orig_params = params.clone()" ] }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Step 2: Calculate the predictions" + ] + }, 
{ "cell_type": "code", "execution_count": null, @@ -2306,6 +2320,13 @@ "show_preds(preds)" ] }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Step 3: Calculate the loss" + ] + }, { "cell_type": "code", "execution_count": null, @@ -2327,6 +2348,13 @@ "loss" ] }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Step 4: Calculate the gradients" + ] + }, { "cell_type": "code", "execution_count": null, @@ -2388,6 +2416,13 @@ "params" ] }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Step 5: Step the weights. " + ] + }, { "cell_type": "code", "execution_count": null, @@ -2458,6 +2493,13 @@ " return preds" ] }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Step 6: Repeat the process " + ] + }, { "cell_type": "code", "execution_count": null, @@ -2522,7 +2564,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Summarizing gradient descent" + "#### Step 7: stop" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Summarizing Gradient Descent" ] }, { @@ -2642,7 +2691,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## MNIST loss function" + "## The MNIST Loss Function" ] }, { @@ -2993,7 +3042,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### SGD and mini-batches" + "### SGD and Mini-Batches" ] }, { @@ -3070,7 +3119,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Putting it all together" + "## Putting It All Together" ] }, { @@ -3411,7 +3460,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Creating an optimizer" + "### Creating an Optimizer" ] }, { @@ -3677,7 +3726,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Adding a non-linearity" + "## Adding a Nonlinearity" ] }, { @@ -4106,6 +4155,13 @@ "learn.recorder.values[-1][2]" ] }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Going Deeper" + ] + }, { "cell_type": "code", "execution_count": null, 
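The step headers added in the hunk above (initialize the parameters, calculate the predictions, calculate the loss, calculate the gradients, step the weights, repeat, stop) describe one full round of gradient descent. As a hedged illustration of those seven steps only: the notebook itself uses PyTorch's `requires_grad`/`backward` machinery, whereas this sketch fits a single weight for y = 2x with a hand-derived MSE gradient:

```python
xs = [float(i) for i in range(1, 9)]  # toy inputs (an assumption for the example)
ys = [2.0 * x for x in xs]            # targets: the true weight is 2

w = 0.5                               # Step 1: initialize the parameter
lr = 0.01                             # learning rate
for epoch in range(100):              # Step 6: repeat the process
    preds = [w * x for x in xs]       # Step 2: calculate the predictions
    # Step 3: calculate the loss (mean squared error, as in the notebook's mse)
    loss = sum((p - y) ** 2 for p, y in zip(preds, ys)) / len(xs)
    # Step 4: calculate the gradient by hand: d/dw mean((w*x - y)^2)
    grad = 2 * sum(x * (p - y) for x, p, y in zip(xs, preds, ys)) / len(xs)
    w -= lr * grad                    # Step 5: step the weight downhill
print(w, loss)                        # Step 7: stop, here after a fixed budget of epochs
```

The update shrinks the error by a constant factor each epoch, so after 100 epochs `w` has converged to (essentially) 2.0; in the notebook the only difference is that `loss.backward()` computes `grad` for you.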
@@ -4154,14 +4210,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Jargon recap" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "#### _Choose Your Own Adventure_ reminder" + "## Jargon Recap" ] }, { @@ -4175,20 +4224,20 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "1. How is a greyscale image represented on a computer? How about a color image?\n", + "1. How is a grayscale image represented on a computer? How about a color image?\n", "1. How are the files and folders in the `MNIST_SAMPLE` dataset structured? Why?\n", "1. Explain how the \"pixel similarity\" approach to classifying digits works.\n", "1. What is a list comprehension? Create one now that selects odd numbers from a list and doubles them.\n", - "1. What is a \"rank 3 tensor\"?\n", + "1. What is a \"rank-3 tensor\"?\n", "1. What is the difference between tensor rank and shape? How do you get the rank from the shape?\n", "1. What are RMSE and L1 norm?\n", "1. How can you apply a calculation on thousands of numbers at once, many thousands of times faster than a Python loop?\n", - "1. Create a 3x3 tensor or array containing the numbers from 1 to 9. Double it. Select the bottom right 4 numbers.\n", + "1. Create a 3\\*3 tensor or array containing the numbers from 1 to 9. Double it. Select the bottom-right four numbers.\n", "1. What is broadcasting?\n", "1. Are metrics generally calculated using the training set, or the validation set? Why?\n", "1. What is SGD?\n", - "1. Why does SGD use mini batches?\n", - "1. What are the 7 steps in SGD for machine learning?\n", + "1. Why does SGD use mini-batches?\n", + "1. What are the seven steps in SGD for machine learning?\n", "1. How do we initialize the weights in a model?\n", "1. What is \"loss\"?\n", "1. Why can't we always use a high learning rate?\n", @@ -4196,18 +4245,18 @@ "1. Do you need to know how to calculate gradients yourself?\n", "1. Why can't we use accuracy as a loss function?\n", "1. 
Draw the sigmoid function. What is special about its shape?\n", - "1. What is the difference between loss and metric?\n", + "1. What is the difference between a loss function and a metric?\n", "1. What is the function to calculate new weights using a learning rate?\n", "1. What does the `DataLoader` class do?\n", - "1. Write pseudo-code showing the basic steps taken each epoch for SGD.\n", - "1. Create a function which, if passed two arguments `[1,2,3,4]` and `'abcd'`, returns `[(1, 'a'), (2, 'b'), (3, 'c'), (4, 'd')]`. What is special about that output data structure?\n", + "1. Write pseudocode showing the basic steps taken in each epoch for SGD.\n", + "1. Create a function that, if passed two arguments `[1,2,3,4]` and `'abcd'`, returns `[(1, 'a'), (2, 'b'), (3, 'c'), (4, 'd')]`. What is special about that output data structure?\n", "1. What does `view` do in PyTorch?\n", "1. What are the \"bias\" parameters in a neural network? Why do we need them?\n", - "1. What does the `@` operator do in python?\n", + "1. What does the `@` operator do in Python?\n", "1. What does the `backward` method do?\n", "1. Why do we have to zero the gradients?\n", "1. What information do we have to pass to `Learner`?\n", - "1. Show python or pseudo-code for the basic steps of a training loop.\n", + "1. Show Python or pseudocode for the basic steps of a training loop.\n", "1. What is \"ReLU\"? Draw a plot of it for values from `-2` to `+2`.\n", "1. What is an \"activation function\"?\n", "1. What's the difference between `F.relu` and `nn.ReLU`?\n", @@ -4218,7 +4267,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research" + "### Further Research" ] }, { @@ -4226,7 +4275,7 @@ "metadata": {}, "source": [ "1. Create your own implementation of `Learner` from scratch, based on the training loop shown in this chapter.\n", - "1. Complete all the steps in this chapter using the full MNIST datasets (that is, for all digits, not just threes and sevens). 
This is a significant project and will take you quite a bit of time to complete! You'll need to do some of your own research to figure out how to overcome some obstacles you'll meet on the way." + "1. Complete all the steps in this chapter using the full MNIST datasets (that is, for all digits, not just 3s and 7s). This is a significant project and will take you quite a bit of time to complete! You'll need to do some of your own research to figure out how to overcome some obstacles you'll meet on the way." ] }, { diff --git a/clean/05_pet_breeds.ipynb b/clean/05_pet_breeds.ipynb index 82b8584a6..76bfdca06 100644 --- a/clean/05_pet_breeds.ipynb +++ b/clean/05_pet_breeds.ipynb @@ -14,14 +14,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# Image classification" + "# Image Classification" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## From dogs and cats, to pet breeds" + "## From Dogs and Cats to Pet Breeds" ] }, { @@ -139,7 +139,7 @@ "cell_type": "code", "execution_count": null, "metadata": { - "hide_input": true + "hide_input": false }, "outputs": [ { @@ -182,7 +182,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Checking and debugging a DataBlock" + "### Checking and Debugging a DataBlock" ] }, { @@ -373,14 +373,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Cross entropy loss" + "## Cross-Entropy Loss" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Viewing activations and labels" + "### Viewing Activations and Labels" ] }, { @@ -606,7 +606,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Log likelihood" + "### Log Likelihood" ] }, { @@ -782,7 +782,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Taking the `log`" + "### Taking the Log" ] }, { @@ -944,14 +944,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Improving our model" + "## Improving Our Model" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Learning 
rate finder" + "### The Learning Rate Finder" ] }, { @@ -1161,7 +1161,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Unfreezing and transfer learning" + "### Unfreezing and Transfer Learning" ] }, { @@ -1360,7 +1360,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Discriminative learning rates" + "### Discriminative Learning Rates" ] }, { @@ -1555,14 +1555,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Selecting the number of epochs" + "### Selecting the Number of Epochs" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Deeper architectures" + "### Deeper Architectures" ] }, { @@ -1692,7 +1692,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Summary" + "## Conclusion" ] }, { @@ -1707,35 +1707,35 @@ "metadata": {}, "source": [ "1. Why do we first resize to a large size on the CPU, and then to a smaller size on the GPU?\n", - "1. If you are not familiar with regular expressions, find a regular expression tutorial, and some problem sets, and complete them. Have a look on the book website for suggestions.\n", + "1. If you are not familiar with regular expressions, find a regular expression tutorial, and some problem sets, and complete them. Have a look on the book's website for suggestions.\n", "1. What are the two ways in which data is most commonly provided, for most deep learning datasets?\n", "1. Look up the documentation for `L` and try using a few of the new methods is that it adds.\n", - "1. Look up the documentation for the Python pathlib module and try using a few methods of the Path class.\n", + "1. Look up the documentation for the Python `pathlib` module and try using a few methods of the `Path` class.\n", "1. Give two examples of ways that image transformations can degrade the quality of the data.\n", - "1. What method does fastai provide to view the data in a DataLoader?\n", - "1. What method does fastai provide to help you debug a DataBlock?\n", + "1. 
What method does fastai provide to view the data in a `DataLoaders`?\n", + "1. What method does fastai provide to help you debug a `DataBlock`?\n", "1. Should you hold off on training a model until you have thoroughly cleaned your data?\n", - "1. What are the two pieces that are combined into cross entropy loss in PyTorch?\n", + "1. What are the two pieces that are combined into cross-entropy loss in PyTorch?\n", "1. What are the two properties of activations that softmax ensures? Why is this important?\n", "1. When might you want your activations to not have these two properties?\n", - "1. Calculate the \"exp\" and \"softmax\" columns of <> yourself (i.e. in a spreadsheet, with a calculator, or in a notebook).\n", - "1. Why can't we use torch.where to create a loss function for datasets where our label can have more than two categories?\n", + "1. Calculate the `exp` and `softmax` columns of <> yourself (i.e., in a spreadsheet, with a calculator, or in a notebook).\n", + "1. Why can't we use `torch.where` to create a loss function for datasets where our label can have more than two categories?\n", "1. What is the value of log(-2)? Why?\n", "1. What are two good rules of thumb for picking a learning rate from the learning rate finder?\n", - "1. What two steps does the fine_tune method do?\n", - "1. In Jupyter notebook, how do you get the source code for a method or function?\n", + "1. What two steps does the `fine_tune` method do?\n", + "1. In Jupyter Notebook, how do you get the source code for a method or function?\n", "1. What are discriminative learning rates?\n", - "1. How is a Python slice object interpreted when passed as a learning rate to fastai?\n", - "1. Why is early stopping a poor choice when using one cycle training?\n", - "1. What is the difference between resnet 50 and resnet101?\n", - "1. What does to_fp16 do?" + "1. How is a Python `slice` object interpreted when passed as a learning rate to fastai?\n", + "1. 
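Two of the questionnaire items in the hunk above, the two pieces combined into cross-entropy loss in PyTorch, and calculating the `exp` and `softmax` columns by hand, can be checked with a small pure-Python sketch. This is illustrative only: the activation values are sample numbers, and real code should use PyTorch's numerically stable `log_softmax`/`nll_loss` (what `nn.CrossEntropyLoss` combines) rather than this naive version:

```python
import math

def softmax(acts):
    exps = [math.exp(a) for a in acts]   # "exp" column: always positive
    total = sum(exps)
    return [e / total for e in exps]     # "softmax" column: positive, sums to 1

def nll(probs, target):
    # Negative log likelihood of the probability assigned to the true class
    return -math.log(probs[target])

acts = [0.02, -2.49, 1.25]               # sample activations for one item, 3 classes
probs = softmax(acts)
loss = nll(probs, target=2)              # smaller when the true class gets more probability
print([round(p, 4) for p in probs], round(loss, 4))
```

Softmax guarantees the two properties the questionnaire asks about (all activations positive, summing to 1), and taking the log turns the product of likelihoods into a sum, which is why the loss is finite and well behaved only for probabilities strictly between 0 and 1.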
Why is early stopping a poor choice when using 1cycle training?\n", + "1. What is the difference between `resnet50` and `resnet101`?\n", + "1. What does `to_fp16` do?" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research" + "### Further Research" ] }, { @@ -1743,7 +1743,7 @@ "metadata": {}, "source": [ "1. Find the paper by Leslie Smith that introduced the learning rate finder, and read it.\n", - "1. See if you can improve the accuracy of the classifier in this chapter. What's the best accuracy you can achieve? Have a look on the forums and book website to see what other students have achieved with this dataset, and how they did it." + "1. See if you can improve the accuracy of the classifier in this chapter. What's the best accuracy you can achieve? Look on the forums and the book's website to see what other students have achieved with this dataset, and how they did it." ] }, { diff --git a/clean/06_multicat.ipynb b/clean/06_multicat.ipynb index 6d986ec96..26c8de5cf 100644 --- a/clean/06_multicat.ipynb +++ b/clean/06_multicat.ipynb @@ -14,21 +14,21 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# Other computer vision problems" + "# Other Computer Vision Problems" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Multi-label classification" + "## Multi-Label Classification" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### The data" + "### The Data" ] }, { @@ -185,7 +185,7 @@ ], "source": [ "df.iloc[0,:]\n", - "# Trailing ‘:’s are always optional (in numpy, PyTorch, pandas, etc),\n", + "# Trailing :s are always optional (in numpy, pytorch, pandas, etc.),\n", "# so this is equivalent:\n", "df.iloc[0]" ] @@ -357,249 +357,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Constructing a data block" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "((0, 'a'), 26)" - ] - }, - "execution_count": 
null, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "a = list(enumerate(string.ascii_lowercase))\n", - "a[0],len(a)" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "(tensor([25, 11, 4, 1, 7, 21, 19, 0]),\n", - " ('z', 'l', 'e', 'b', 'h', 'v', 't', 'a'))" - ] - }, - "execution_count": null, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "dl_a = DataLoader(a, batch_size=8, shuffle=True)\n", - "b = first(dl_a)\n", - "b" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "[(tensor(25), 'z'),\n", - " (tensor(11), 'l'),\n", - " (tensor(4), 'e'),\n", - " (tensor(1), 'b'),\n", - " (tensor(7), 'h'),\n", - " (tensor(21), 'v'),\n", - " (tensor(19), 't'),\n", - " (tensor(0), 'a')]" - ] - }, - "execution_count": null, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "list(zip(b[0],b[1]))" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "[(tensor(25), 'z'),\n", - " (tensor(11), 'l'),\n", - " (tensor(4), 'e'),\n", - " (tensor(1), 'b'),\n", - " (tensor(7), 'h'),\n", - " (tensor(21), 'v'),\n", - " (tensor(19), 't'),\n", - " (tensor(0), 'a')]" - ] - }, - "execution_count": null, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "list(zip(*b))" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "('a', 26)" - ] - }, - "execution_count": null, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "a = list(string.ascii_lowercase)\n", - "a[0],len(a)" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "('a',)" - ] - }, - "execution_count": null, - 
"metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "dss = Datasets(a)\n", - "dss[0]" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "def f1(o): return o+'a'\n", - "def f2(o): return o+'b'" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "('aa',)" - ] - }, - "execution_count": null, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "dss = Datasets(a, [[f1]])\n", - "dss[0]" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "('aab',)" - ] - }, - "execution_count": null, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "dss = Datasets(a, [[f1,f2]])\n", - "dss[0]" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "('aa', 'ab')" - ] - }, - "execution_count": null, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "dss = Datasets(a, [[f1],[f2]])\n", - "dss[0]" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "dls = DataLoaders.from_dsets(dss, batch_size=4)" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "(('da', 'aa', 'ea', 'na'), ('db', 'ab', 'eb', 'nb'))" - ] - }, - "execution_count": null, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "first(dls.train)" + "### Constructing a DataBlock" ] }, { @@ -733,45 +491,6 @@ "dsets.train[0]" ] }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "Path.BASE_PATH = None" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [ - { - "data": { - 
"text/plain": [ - "Path('/home/jhoward/.fastai/data/pascal_2007')" - ] - }, - "execution_count": null, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "path" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "#hide\n", - "Path.BASE_PATH = path" - ] - }, { "cell_type": "code", "execution_count": null, @@ -914,7 +633,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Binary cross entropy" + "### Binary Cross-Entropy" ] }, { @@ -1289,7 +1008,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Assemble the data" + "### Assemble the Data" ] }, { @@ -1509,31 +1228,6 @@ "xb.shape,yb.shape" ] }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [ - { - "data": { - "image/png": "iVBORw0KGgoAAAANSUhEUgAAAfoAAACxCAYAAAAs/X9SAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8QZhcZAAAgAElEQVR4nOy9aYyl2Xnf9zvnvOvdb93aunrfe3qmZx8OV200tUCkvEiGlThOAtkxYATxt8RAAgSMkQ/5EORDAAcBEkQxYCtELDuOFFsiFVGkSJEUyeEMZ3p6eqmu7uquru3Wrbu/+3tOPpxbtxRAHMCCZA+IehqN6r7Lu5z3vff//P/P/3lKGGM4iZM4iZM4iZM4iR/PkP+uD+AkTuIkTuIkTuIk/uLiBOhP4iRO4iRO4iR+jOME6E/iJE7iJE7iJH6M4wToT+IkTuIkTuIkfozjBOhP4iRO4iRO4iR+jOME6E/iJE7iJE7iJH6Mw/mwJ4v/6m8a0+8jrl2DwQAaDXAcRBBSfvd7CEchfA88DzodcF3Ep38O83v/EnwfNjfBdaFet3/HYyhLWF6G/X1M7xBRr2GmU4TrwsWL9vksgzCEOIYognbbPh7H9ifYx3o9u02wx1Cp2P0kCRhjX+84sLCAvn0HWQns61wX8tw+d3R8SQLTKbTbiDPnMQ/vIzodzOYm4spVcBzMg/vg+4jzF6HZhr1tu43zV6DWhCxBnL2O2byDvPIyev0dePoYrr+IaC5ivv7bmGdbICUoBUVxfH6+D1tbIMTxMR0cwOoqorOI2du1x1wUoLV9Xa2GWDuDGQ9hY8Nu03Xtc0kCtZp9bVna9507hwhCzNZTzMEBolq1x3/qFDx6ZNev1QKtKe+toxpV+/6lJbtuaWqPt163+9rZQU9jvP/1K+Iv+D79M8dvbvyGyXVOw2ugTYkUCkcoDIZufABA6ISETogrHaRQnK+d59F4A4BJPrWvUQGu8kjLFCUkgQpIy5RxPkEKSakLlHRo+y1ynSOFQglJoUu0Kal7dXKdk5Yp2mhKo6k4IZN8wlGLqxACT3qEToVC5xSmQBuNEg6udOinA7TRSCEJlE+uczzlURqNJz0cqYiKGFc6VJwKWZnhSJdhNqDptQAYZSMMmrbfxpEOaZkikK
xWVilNSaACQqfKQbLPUrDCIDukG++zFC7jS5+t6VO68QGhE5CVGVIoDBpXutTcGnvRPo5UONLFGENSJtTdGp7yyMqMpEzwpD1mIQRKSBpeg0KXdOMuAIETAFDoHE96CCEodElpCppeE1/5DNIBcZmghMSTHkoq4iKer5XBMEiHhLNtSSHxlY82mkKXeMrFkx79dMAkn/C3rv3tj+w9DPBrX3rPxFnBjdUaO6OMVugQOJLAlXz93gGhZ7/KHSU4u1DBcwR/9cYq/+TdbTwleLA7xnMU9dBlseaxO0wQQnB2IeBJL2YYZTQrHqM4RwrBzdN1itKQloaqJ0lyTW+Ssdr0kULQnWRIITAGluouu6OMw3FKmpc0qx4V32Gh4pIUmqI0ZKXGkYLFmsv3N/qEnsJRklrgkJUaT0lKbTAGKr7DOMk50w65uhjwzvaU002PD3amXF+tMs1KupMcRwpeOFUhcCRbg4xWqLi5WMeVkrrv4DmSvUlCxXEYZTnr/Smnaj4Vx+H+4YR3t6c4SuBISVqU7A8TTi9UaFccHuxOkFLgOpLlRsBWL2KpEZCVGt+R5KUmLzRgP7eN0K5/nGl64wQAz1UErmIc55TaUA/tZyIrNEoJ6oFLVmi2DyMcJWhWPXb7MYGrWGmFeI7EcyS3n/RpVjwKbTi3WKXqKYZxwSTNWW0E5Nrw5GDK/mHEt/6Ln/hT7+MPBXomE4SUiGoNkyQWgMsSE0UUwxjvtAV3jLFAohQYDcMhVKuYokRIeQw8oxEEAUwmkGWIMICiQPj+MbBPJhZwRiPMs23E8pLd7nBo91OWFhiTxO5z9lpc1wKbMRYQwW5TKUgSTDFLEI7AXwh7Pp6HqDegs4jZfmZBOI1BCExR2O2evYR569sWjFstzGiAiCbw6qcQQRWxsIrJU3vRXR/O3cBMBlBpwLlLcLgHy2cRH/9LkEzh8T17LL19TBwjllYx/QO7NnEMCwt2v54Hw6Fd+1rNJiKua89Jyhnw5zYJAwvEQWDXSEr7frDrI4RdQ8exyVq1ap8PAvu479u1TFNwXaQ7SxrS1K5Tktj9BYF9rNUCKRHqoy0KFbpACkmoAqbFFEcocl2Q6YxxNqXqhnjSpdA5EkHghsRlRFbmlDOgFQhKo3GModQFSIekTDDGIGZ/SqNpuXWSIiHXOTW3xjSPyHVOxQnnIGeMQUlFxakSFdM5cGdlPgMjQTFLCJRUKOEghCDXBaUpEdjPca5zlHTIdY4jHKpuBWMMucxniURJru3nwFc+FSekl/RwpUPgBPN9nqudRyCoOjVyk5OWCUkRAbA+vEfVrVJ36/yg+wNe6rzE6eoZViun2I/38KRNfEqjaXoNDtPDGZiWVJ0KhbHgnJYpBrtWStivHCHseQgE0zyaJWGS0pSUurDbESWFKfCEhzYlSjgUpsTDoDE4QuFKF0e6ZDpDCYeKE9rkRQg85aGNQRtN4LhoY7+Y1ezcxUzQ9JT3b/We/LPEOLbXcpyWKGnXbpiWJIUmKzRV3wKkMTBJS9rKISlKuqOEZsWeX1FqAleRFJreOKVZ9ehHBYW24BOlBY4UhJ7DJC0ZRjmtqkd3nLPZnXBuqYqSgv1xhjaQFyWn2wFxZte1WXHZH5bkhaZSV0gBSV5SlIZ2xSXXhlFSUmiNNpI4KwhcNT+/auBy63SVwJF8sBcjBdzrxpTaEOUaz5VcX6zwrScjlLBJw/4kp+opfuHKElIIqoHDzihmZ5JgMKSl5t7hhKQw+ErwG2/t8O+9dopbyw1udOrsRTFZaRgkBadbPhfaPhuHKRXfYRhlnOtUcJUgLzX9aUo9dGkEit1hQakNSgpcJckKQ5wVgL23i1ITeoq80BhjWG2FFFozjnOkFASuohE47I1SlpsBSV7SDD2qvkOSlTQrLlFaUnEloefgKAlo8lKzNyootMZ3FOO0YLnuIYB69Uffxx8O9FKipzGq1oCDLqLRwAwGkGU4zdC+5oh9C2EBRpeIc+fg1uvw5d
+ygBPHVg0AywjL0oKK1sfPVSqWwbruHOxFu2XB5elTC+qt1jHjHQ4tm6/X0WmOrNctIz51yr7nKBno9Y7fJ6UFtCMmH0WI8xcgrGJ2to5VC7CvAcQLL8GWZbvi0z8HP/gm5Bl87GeQZ6/Z12oNZQHJFJNnkCWY/h6itYzpbsPOU0y1AfW23eaNVzG6hDxDNhYwkyHi9ncxR+x7OrU/lf0QiOVlAEy/b8/Nde1+XRczGh4zfLDvrVYtGE+nln1nmT0fOQPlJLHb1to+127b1zqOfWw4RHQW7P+VsttqNI73E4Z2XX0fcfHih95C/67DlS5pmeIpj7i0LOYIZFp+AyUVmc6pOhWUVCRljCuXaHgN1qpr3Dl8H7AfXm1KDIaszHClixCCwAnQpiR0mpSmoDQaX/lM8glKOjPmbROLXBe40iEvcgIVUuiSXBd40l5PKQRJkVD36iipcISDFJJhNiR0QnKdE6gAT3l40sJdUhQ0/AaudJnkE8bZhLpXQwmFkvb9Hb/DKBshheJ66zqPxhtkZcalxmWqbgNjNEkZMcyGjLIhAHER83SyxfMLN+klPX57/Y+Ji4TLzYskRcK5+nkAam6d0AkptU0slHDIdDZXLo6A/Yh16zxCCglGo7FKhhJyvrYW7PX8+skZmCelZUmFzoEQY4xVBUyBg0uoAib5hHKWOOW6IFQzZcDMvoARFKZECYkjFKUpcKXLQtD+i7wF/1yi4jvsD2OuL4Xc68a0QofdccYo0VR8BynFnCnWfAvmADdWa/zS9RX+h288YhRnDKKMlWZguVdWMhI5gas4u1hlFOU0Kx5LdZcnvRglBQejlGrgsNIKSbKSh/tT0lyz3AzQWiAFTLOSg1HCQs1HCFBSsNWLuLRcJS80nZpHnM+Si4qHEhYcmxULYFIK8kLz+rk67dDl+1tjDicphXa5uhRSGpACPn6+wR9tDgldyV9/bo1vbB2SFYafOtehFtgkpz/N2JkmPBkk1H1FqQ3vbk9Za/kcTHLubfT4F65kqREQOJJLHZ9W4HKhWWGcFaxWAko9sueV+gzignrgUA3sfXyuHVBqg5SCojSU2t7DaVHSqnpEaQkUaGOYJgWNikuUFgyijMW6T3eYEHgKV0mUhLzUaG1wlCTJSxZrHo8mGUqA70qe9mNW2yHGwOEkJclKOnWfg1GKo2zCsDvKqIcut842fuT98+FAv7CAPAIfz4OwYoHaGOT5s/N/I6UFlOHQMtaFJUR7BZPnFliPwKJahenUSvUAnocejpGdjgWwZtPutywRyyuYjYd220mCyQuEO7XbOpLvlQLPQ3batnQwGiGWV60UKoTdd1GAEKiFpj3WILDHMRjY56XC7O8cs14AbSwYpim89mnM//vbiDc/gzxzFf39r8PLn0CevYaZ9CGa2ETh8R3o92asfxE2H2J6+7D3DHP/PkKqGZt2Me0ONBYQzUUr97sefPxzEE+gtwPVBmzctccVRVCpYfq9YwAPQ3s9lLLHfXQNajUr5zebmM3NY4YehvZ8sgwznR5fDymPWXsc2zV0HPu+orDvAQvwcFxKyTL73qMywkc4XGlv8UIXOMKCZ4llnEdSvTGGwhQE0gJDaUrqbm3Ong3GssAZa9TGsn1HzOTpIsEog5KWWRe6pOJWcYRikk8RQpCVVh0wxuDOFIS0TJFCYjBz1i2EIFCBVQuEQBuNNoastMqAQMzPySYOLkoqoiIiK7O5PA2QlimxKekEHaZ5xNnaGSpODYHkdG2Nqtvgg/577EZ7rFZW+M7OW6SlVTquts/zbnedjcEW3WjE/We7rNVqvL13H2MMZxtLXGtfoupWcKXLSrjKSrhKprOZXF9nO3pGlEdERTxLVAoL9kYTOPYcNWYuwTtCIYUicAIcoWbrLdEYPOkRFTEGl6zM8JRNjjKd4StNYTSFsWUSy+ZdpFAkRYIrXZuEKJ80T8mMxlMeSkgMmqzM/23flv/GsVjzGEUZe5OcSVpyYzlknJYIoT
m3WCXOSpv8aEMjUPSigv0o4XInJMpKJklOmmsaoWAY5XTqAdMkZxBlhJ6iUfE4GFuw3h1lLNQ8PCUZJQVXlkLe3RoTeIrxMGec5NQCh9IY8tKQlxopBJ4jaVY9zrZDupOMl9dqvP1sYoVYbShKTaE17ZqPNgbPVbRCh+44Q2tbJnh3Z0qcaaQ8ktQNUVoSZyWvrzX4xmTA33njLO2qR91XXFmpUvUdvvn0gPv7CaeaHm9tDpFS8Gh7xNUzLZ4eTHi0N0Zrw2SSst2LeLg1xBj4YKlKLXBYqPmca/uUxnC2XuFCs0JSaNbqIT/Y7eNKwTQrqfuKXpTTrrh0xxntqkvVU+yNM7Z6Ee2ahxCwUPOp+A6rdY9xnNOqeEySglrgEmcF48SuoxLCqlNHKk1slQJXCaQQ1HwXJQXDOMNzJNlMwUnyknGSUw9c3FkpYX/8o+/jDwf60QjOnYPxEBGGMJ0cs/HtbQsMRWFl+FrdMtJ+z0rd3/qKBZEsswClFIShrct77lxKlosLx9Jxo2Efn04xvQP7niAApSg3t3HC3AJcr2dZv+vCdGqThcVFxLkLsLIGD+7B4iLs7dl9G2O3XRSICxcx+3uWwfq+fS7L7PlqbUEPEKfPwJkLiMXT8IVfRZ65DlIi/+rfxSRT0Brz/T+wQL9y2q7NoAdnLkBvD9PdRYz6EFYRN56DPMNsrFsgHQ4xSYpYWUZ86qcgqCKaiwjHg3M3bDIglYWZJLYKglIWyCcTe9xHcv7hoVUsytLW0osCM51iJlNEo2HXIc8xDx8isuz/z/6NscdTFHZdtT6W+6W0f80s6RHCJmK93nHJpFY7LpN8RCMtU9rBArkuqHt1W8NWVgrWs9p2ZrJZbdwy00Ln1L0Gm+PHlv3PpkTX3BqBCtiP9yl0OWeIcgZQxlhA8pWVoxOd4EiFQOA4FUbZyDJ45ZHpbF6jPjpOT7k03SaudBmkg7kyEDoBjnTnHoOKE5IUCcWMQWujyXUxSwpKSmOvbydYoOm1qLuN2c8WGs2V5tWZZD7iXz36Ko8Hh9xcXCPXOTuTIZ85e4vN4Q53ul0utQtuLV3k6sJpBsmYzeEu24MR37z3kCT+Q2r1kL/5+idpeI9Zq67SCTs4wqHltTldOcNB0rX1egzalHjSY1pE85r70XkdrcOc3QtJWmYIJDW3hic9kjJFCpsolNoCuiOs6pXrYn49KiqkNHq2PlYhsN4FF0/ZuvxRcnXkAfiox+E04/JKjTjXXOwEPO6n5KWh6SsejlIWG/5cfl+oOOTaEBclZ2oV/vuvPyT0HQJP4SjLplcaDu9s2npwLXDpzRj5JMmpBRa8hLAf/51hZpm3ECzUfQbTjCgtaNV8tgcp0yTHcySDaWZr/aHHrVNVtDEcTjOW6j6Hk5Rq4OIqSafuE2clV5dCdkYZhxMridd9yc7QIARobazM3fK4tBhwc7HGlU6Nf/BTNTxHUpSGV5ZbeI7kMMr452/t0j2IeO7SAsbAnYeHfOKFVbqjhO3tMVcvLbDcDPjk9UUOpwXvbhywvT1i60mPeBJTqVf47Kcvc2dnyq3TNYqZN2G54vPycgshBkzSkqzU5KWh4h2XLKdZScVTHIxs3b4oDa4jMQaivGQc5zhScnmpQlpqHuxO8B37faGUZDq2CsBS6JKXGiUF3UlOK3RwlPVBeEoiETzeG9Op+zQqLg+eDQk9RUU51BoBrvzR9/GHA70QiBu3YHN9xlo3bP14d9dK5Ds7sLpqwXRhCeEFkKVQa1mgznPLJodDxK0XMTvbFuSPGOWs5k2eW6YIsL9v1QHXRdy4CdMJ5uG6ZdlheFybLsu5DC+roTX3uS6i3ztmmdWqBaMss0lBkmAmYxiPMVmOEAIzGljQH4/t9oIAihzxU38ZUalDpYHonLK1cKMxe5tQb2P2HmPufwD9PjzbsseWpohbb8D92xDHFh8GA8TFK7CwCNMJ+t4DilGMjjOCwMd88C6i2cYYDV4AV55Hnr2OWL
uM1iWi1rYJxXh8rKAMBhaQazUr3+/v20Ss27XXJY7t60ajuQIifN+uWb9vH1tYsOsSRRAE6P4Q2Wohlk9hjsoiZrbmYF8n5XGZ5ciYd6TCfERDCkXbb7Ez3cVTLlE+pRN06CU9Wn6LUTYiUOHcYGekoTRWMcq0BRpHStIypRN06Kd9W6+fSf5KKGpelay0oFKYgnE6mbFzl9PV0/TTQw7iAwpT0nCrCGzN3R6ftCqDSYhnYFR1q3OzWuhUMMxMP0iSIiFQPlERz8xwJUmRUOqCwhQYDFWnghSSy41raFPiqxCMYZQPmBZTHgzuc6p6ip3pDt/b3mI8jtgejbm1ssyz8ZiqU2VjsM9qrUY3ivi9R++xWqvx0vIl7h/uWxUjyRgMJ1SqAd94ep/L7TZv792n5vl85szHaHhNqk6NiltlMVjiyWSTwpTkZUapC1JjcKSan1dUxHMWHwhr9DNGU5pibliUQthzRpDrnNCpgPTm/orSlDhC0fJa9NM+hS5njN9HYg19Bk3drc/vj8KUCPPRB/pSG26tVvjy3UNeOlNnd5Rxvh2w3o24uBTysBux2ghohQGn6wGhK0kKTSNw8V1FlBZIISi14exCwGYvwXMkriMZxzmuI1mo+XPAvrIY8M7WmFbVw3MEv/riKre7Y763OaQoNNXAqlmFtnXjo/+HnsPhJGXLkzjK7s9zBK2qRyNwGMQFNV8xiDJ2Rhn9aY7WhjgteNLPGCYFg1lS4LuSvDT89RfWUNIy3EIbnh5G9NOMO90pZ5sejw5T9vanjEYJ33t3h4sX2hSFrYdvbA25fKHNzv6Ep9sjHEdyeqXGcJjiug5FXlCWJWE14L31A1aWquz2IzqNgIuLFSaLBaFSXG3VCRzJd3f6jBIL3nFWks98D7XAoeI77PRjGhWX/WHClZUao6S0trFSM0gKtLFlwLzQjGJLMM90qiR5ySQpcJRgHOe0ax5vnm3w++t9hlFGkpUs1H2aNasMAJxdrtGp+SR5SZKXMPM7/Gnx4UBfqcDyWczv/BacPXtsdDuqE/u+ZZhBANtPrPv82SZkMwZ47hw8eoSJE9jfs+xTSgsejoO4dAWSGLP52ALIkVxerSJ++vNW/n/vW/DgPk6rYvdZFLC0hO71kbUZ4zSG/Mke7tISTMd237u7iIuXMHmGaLagUoPxELP11CoQ9Zo9BteFqzfsuexsweufQbSWLZOXM9NbNEbvPAQhYeuhZdnLpxCdDtG33kUnj6h+9jWKB48RD/8ROsrwfu5nMJuPrFv/zns26ShL8oMxwlEEl1Ytq9/dxaSpPZabL8Po0CoGkwHy/E1MlsDF64jmgi0DjMfQ7yNWV+37PA8RW/Pg3HDneRTjBLdRt0lPv2/X7k/W6BcW7P9nJQu5uIB44WXMB+/ZJOJPyvOz2r3eP7AKzGhkEzyl7Pp/hKPpW8BJygSnsPVeKwEbBHLuGtcIpvmUtr/AOB+hhMJXPnXXYz/uUuiccTYmKRK00YQqIHRCqk6VwhTkuqDiVsjKnLiIqLoVnms+h5oxzp3pLgJb50coPGVr6o5QCCkwxjAtIqpuzRoAhWCST1jwFyhMgSOsk9xTLsOZc95XPmC/OBbDRTzpsRfvc6V5DUe6KOHgSZ9cZwyzPm8fvE2uczYGT/nGs+/z2srz3Fpe5kvv/zFxmuG9qni6tc/fv/0/E/ge/+AX/wrffnaXtCy5s7fP+7t7tKsVkjil0ajyxo1LHEQRcZ6zO5ngSsnrp26wPd3hQv0CTydPOF+7gCNdOsEipSkZ5yOMMQzSAcGshg5Q6gJHKVu/xyZZg2zESriEQBCVMUpYVqkxGI4Thbn5TrqcqZ3l6eTJvOwy3/ZMOZgWERUnRAnHrr0Q83X8KEctcNFAlBZsHqYoIRinJVobLrR9HnYjDqcZKw2fO90pL67UWD+MOIxTGhWXBi7dYcIoKuhNC/LSMs+FmjV7vn6uzjgtubMz5cZqlSeDlGJWf/6PXz2LMYZRlv
Otoo/vK0JPkRWas+2QHwwTtElpVz2MMTw7mHK6HTJJNcbARjfipTN1plnJWqNCp+JQ8xT39qys36i4OFLiSsGtU1WWqi3ud2P+2s1VSm3mprdxUrA1jPiXd7oIAQejhC/3Y968tkirFbBxd4tiOmE8XMFow2/efgTA+c+/yupSlYN+zM72iI37O5w6u8je1j7nr6zx3M1VHj3qk+clkyijVQ9oVz2SXNNPMva1ZrUSIoXgxaUmy1WPnXHK1jBjb5hyrm2d75NE4bv2/pVCMEqsyjSOMlo1j7w09MapVSwwdOoB4zinEShcJdjsTqkFLq2qx688v8L/c//ACqpZiaMkWW7XU2szNwZ2RwmrrZCmryg+5Lv4Qy3T4rOfh8H+sTO70bBsXkoLBEfMfCYXIxWsrGH++A+sI/v0echzxNKiBYyimAOzuHwNlMI82bSgIaVlqouLyM//KnzwNuaD7yHOXbMO+LywCUJRwHhsW+V8H6TEpBnuom35MpOxVQnW1uDcJcTCEqQJ5tkT66p3XXs+Z85YRuwFUG8iP/4L8MnPIS+/gmguWUPdsAujA/Q3fxtGh/D4LhQ55v4d2HpsmbTnUKQ5yR/fphhMyfdHZPsjzOMNxBufJPvhPcRLrxJ99S2K9U10kuN2auT7A4pRbIF4fx/x6iesRJ9nmHtvYcZ9zLiPcD3k1Vdh+bRNVmateUZrC8BHxrqytNdn1sHgrrQtYI9G9mK67nHr3RGbP3LdzxIA+dpnj9WVSgXxt/4eACaxiYi8cA5WVuz7u13Y2ED4wY+6fT4Scal+iVE+xJu1e1XcysyYNjvnGbvWpmSSTyhMTuiErA/XZyxSoo2m5bcpjSabOdk1hqpjmfc4mxCqgFLbbSyFS1xpXOXhaJ1xPqLm1Cxzl87MD1CSlTmutHXkrMwojDXqOUKRlfncUe4rH4FVFOIiISmSeS274lYJHJtweNLjdPUclxuXqThVHOEwyYc8HN1jfXSPX7/zG2Rlxt2ebRvcGY/59va73Ol2qddCOq06H9zdZDSKcF2H7uGQf/z2N3l55SJvfbDBL998jfdub/Dg8Tb7+32ajSrvPd5i4/EOg0nE+v4Bnz77AsN0wv70kK88+X12prs8Htsv26bXIlQh2ug5MMOx+96RLqXRVl2RDoEKaXkNpFBzA52vfAIVoISk5bfmLn4l5CwJUCwGS3MzX2kKXuy8OPc6hE6Ftt+i7tapuVVyXTDJp/Nk7KMcf/u1M6wfJDhSEGcF7YrDk55VQR/3U8rSUJTWnT7NrCK11vD5P35oE8yDkQXutYUKaWElZt+1ic615QqlNnz7YZ8z7YBRUrLdjzm3EPKffeI8/90frPPgYMKlVo1q4CKFZddJXrI3Sgk9K0O7jqTQhoWGT+BKorykU/e4vlLlXNOn6lnvwHu7EZuHMY3QI/QUN0/VWWn6rDZcrrarfOxMh5+/vITn2MRuf5zyva1Dvr7Z5R9987F1/g8TCm0wxvDVt7cZDhNc32Xt+kWGT7eIxhFu4MOwy1f/4AOGk4x772/xxstr6FKz8cFTPN8jy0ref2+bfnfAdJqxuzvBdxVPDqbcfjrgSz/Y4RuPRjweT8lnbYCXWzW0ObJGCQpj0Ma67oUQaG1d9lIImqFDJXCJU9sBUWqrenhKUpSaM50Kh9OC0JVUAmuq9BzJxRnLH8UZgaf4L3/mCoXWFKXGdSRXVutcX6mx2gp5vD/h9rMRNe9H38fqi1/84o++u6LBF1l/H1GtWOnacSwYB4GVwbsHiIWFY/C++YqtWW9uQKOBiKfHju3JBBNFCMeBtTVEowmRNYaJdscy/moV+Yt/A/O9r2Fuv4tYXAIlYNBDHBnDjuT16dSay4YjTKkR9RpicdECTxzZ+nwyRejSJiFHpYFZ/7yoVhEvvQHnryFf/klAINorFgjTGMoc8/53ME/uQ60B+zuYJ9skVSoAACAASURBVBtwsG/PaX8X8ZnP4oSC8M2XyO+sc/B4wP
AgYmtrTHM6wnUznBefxzx5THz3CU8/6LL6sYvE93bxVhpED/YoeyPci2vQP4A770GvC6MBorOMCKoQTxH1tjXo7W8hHAWOQlx7Hp48PpbpK5Xj0kYUWaXlKI6MenluE4tm87jVTinbffCzX0B/+TdtDb5SQZw7D/tb0DtAeB40GpQPNtBbu8h2A72zh6iEEAbIT/zCf/Nn/wr7i439ZOeLw3RI3atTYmu7ts3OtsjFRTzv2/aVT9tfYFpMmRYWAJIymTPPpIwpTYkSik6wQOAE5DrHlQ6+4zPNp9S9OufrF7g/vM8kH1tZX2cWzHWBIx0cqVBCkZuCXOfzEoCvPKpuFSWVTTp0gcagZswTrKnQYMsLoQroBB06wSKrlTMYNBVVpTQle/EOg6zPV558jfd792n5dQ7iPu/uP6MbjdHG8PDwkP/8zV8mqBd8+tpz3H62xdbuAdM4Ie5HTJKEp1GfN65eYGPQZTiJefB4m/NnV9h61uX0mSX29wc4SnFupcOTUZe3dp6w3uuiyTlVW8B3fIbZgAW/Q+hUyLWVKw2Gtt9mnE1IZmY8IZgxcXt+R2Y8ZgzeVwGZzjEwnwFw1HJY6JLrreu823uX0tiWypbf4jA5pNAlgeNj0PO+eSltmcAYQ6B8LtSvfGTvYYDH3eiL6/2Iy0sVRqnGcyS9iXXEj5OSrYMpq23bP+9IweunWiRFyYNejOdINnZGnO5UaVc99gcxSV7iOYqry1UqniTXBqUUZ5o+jw8Tqr7Dr712hn/4ew8sc1/w8ZRke5ISZyUIQcVX+I4iSq2UfTBMUFLSqLicagY0fMXTfsreKCWdGcyiTDNKSrSBrCgBC4aLVZefOLfItZU6xhjqoUOUljzsT9iaxPzOnQPu701xlGSSFGztjen2IsrSMJlkvPzcMsurTRbaIZH2GT99jI4mYDQlip3tAWvnV0hyTbtTp7uxSV6UxHHGmfPL5LmmyAtWVpvsH0RM45xn2yOMhIVGgDGCbpxythHOlQvPg0LDJ862eHd3QpQWloHnJSutACkEvUk2e9wQePZz3Kx4c1OdoySlMbhS4ijJNC34O2+c5X/8o00G0wzPlbx6rsn64ZT9iQX95brP248OeXIYIaVgOM2oBi6OI/nZ64t/6n38odK9Gexj7n+AuHDJAsNohIkTxFEftxCWMTabiJVTtp1s/5ll3WmKOTyExUVEpYrpdm3v9uoqIqxANMHE0axentnXvfxx9D/7dZsYLC3BaACeb9lnp2MBamfHMtGZiUzUa4ilJWuOy3MLVFojrlzFdPdtYgAW/MIQlpcRzbY12l26hfArgLDM3hiYHGImA8zD27C3jYkmFkxnDvXyzj1Mrhne22Xxp38eUanCS5+gojWD//Y3GGUFS3WfB/cOOTP5LpVmSHBxkcYXPkVjOOT2P/0O1z5+FvXaqzQuX6S8t269ENUa0x9uULl1AXHjBcvsD3fACxDNDqLSRPz0L2NGPfi9f2YTqtJ+4Gi1MLt7FpCVgn6foj9F1QpEo267HI5aDms1a8xrt4+d9XEM1bpNYCoV2zb3+k+iv/Iv7NqdOQPb26hmbT7MRzZqdt9x/Of4dfbnH/30kKSMUaJKXES2b33W925b5OR8wE1D+VScCmmZzNu8sjKn7tXIdYGcgXHNtQw9LdP5YJvClCwEC1ScCu8cvEPVrRDMhsoETohBsxQukuucqLAytEDMlYXQCWds3rJ7T9r9REVMaQoKXRIXka3zew3qqs5SuEzb65CUEUkZ4cmA0pT00i4HSZfv7b7DBwfbFFqzN91kpVqlF8f0Doa02nW+9Z3b7L70E0zzjJ+/+AripwX/9T/9P0nHCaLiMBhNWN/YZn1jm5vPXeD5a+f4pTdf4n/6v3+fT7z5PM8vLaGk5MnmLkoIKo7Ds6f7XLt0mnONRTaH26Rlxmpl2Q7gCVc4V7tEWiasj+wsiUxnc2PcIBviSBdHKKaFdeuLmXciKxPKWclECMEgHVJ1Kw
gkmU5Jy5RUp/N+eykkS+EyT8ZPAGukHGYjfOXj/AlDIzBv3fsox53DEf2oYKXmMphmbHYneLP6uqMkSgkmaY7B4dpSSKFtDznAKMq5uNqww2lmju1OPWCx5lH1Jfe7MZ2qy3Y/RhtYa3q8vFrnH/7efVZaIa4S9KKCqptRGlhsBCzXXTZ7CbnQZEVJq+pTD13SWQIBcHc/wlWSK0shW4OMuq/QxjBJc3xHcWEx5GonZDH0Wa0FHETpfEZAWmgeDSasH8bc2Znw/noPz1MYA0oJBoNk9tVX44Pv3GZlpUaSFLxybRHv1ip/NJoy7XbBcSGNMXnGk9sPWDh/htVTLa69+SL3f/iQs5dWKYqSJEpIooRirU2j4fH+D59w+vwyrUbA+vYIIQStzOFdf8BzusHNlQbXdJ1pusv2NGZvYNsRF+o+e4OYrLAye2+cMJqkhKFLPXRJ8nLe3bDS8Pnu+gHnl+vzcx7HOVmpqfoOWhs8V/LxU23++d09AK6v1Li/P6VV86kGDhc6FbrDhNBTTNLyR94/H8rozcO3v8jmQzsEZ1bnFc5solsQzFvkbDvcKuLMJSvbH7W3HcnLo5EFmIMDRKcDaYIIqzAaztvgxI0XMd/5uk0IHMeyzIWO3Y/vIVzHGvW0tmz1qCZdFHb7M1lfrKzO+sNn7vlqDfqHx1PjXBf1K38PsbBin/cCzOGu/dl9CmWB2XoAW4/Rb30fef0m4sU37GCYfg/14kuomzfZ+/J38D94F+fiaUSWgOOw8rk3WDzYokwK7vUjblxbQijJ3vt7VN2Ssjdid+OQU88tI+oVdr/0NfqbfVo/+Qr0+7jLLcQrr0Nv3zr4kxiGhxg/gHiMqLURYRUTBHCwCw8f2PWNIsuug8AqLIMBGI1s1MHzKA+GSE9ZwN7ZgSy3a3rugl0bgO6slc/3ET/zeXjrG7C3i+kPELWqde6PJ3a2ge/bfTgOlCXys7/8kWVD29HTL0ZFRFqmZDqf924D+Mqj0MV8Il3VrdLy2myOH8+m0pW4ykEgZu8yxEUyc4An855xPZOca26VnWiHuIhnQ25yy9DFcYVskk/nxjmNmU9x00bjqwCDoeZWZ8BukyglFYWetYkKhRSCVxbfmA+n8VXIo/E6Na/O+uge42zM+uAh+1GPx4M+r506z39w8+e52FpGyJg3L17ltbVL/OsfvMXtvSdUGhUG6T5RnvBLr3+MoV+gC814a8DHPv0CwsD99S1c32Vj/4Bnj/dYPb1IqxLyW7//XQ72+7z03CXGWca5lQ7PLa4QFTGPBocYk7EX9WgGNZ5NnnKqsjbrTPDpZ33GmS0txUWMI128mTP+aAJhxbVmxKRMCZRvOyfyaO7i7wQLpLMkINfWYOtKjzO1s3TjLnEZExcRSipKXc5a+3xCJySbDfJxhMPV5o2P7D0M8Lv3Dr64M0wJXMXeKKUReriuJM1KGhULsI3QpSgN59o+K5WAbz4ZIATsDWNaNTtZzZECR0n2hzHnFis8G2QEM9ObNtAMXV48VeNbT0b0xilSSiZpwUrDQ0lohw4aeDZI0QYGUWZLAEBe6PmUu9zAhYWAcVqy2YsIPYdO1eHJYYLrSAJXEWWav/HiaWq+S5yVhK7i2TgmUIqHgwmTvOCH21N8VzHJSi6uNbh1vs355RpLnQq4kosrDW5//y7DROAHHloKDgcJZ893qC60GfQjdBpz5fUXKIXLZDBhPIoZDyPywy5Ovcnyco2ndzcxvS1ka4myhIuXlzi7WkdKwfbuBCPhcJJxph0SlTlN16VZcVkJfbYnMVuDFEdJorTAdSRKCpbrHr1JxjjKaNdtu93BOJ0lP1Ue7E9s54gS3FqrsT1IMcBhWjLJSiqewxduLPGNp4ds9xN2+jELNZ9CG6K0oF3zCRzJ3jBFzsyKf+XWyr85o+fRfcv4jkxcRy1WcWxb5QYDC/btNoQVTDSyw20WFhCdJczWEwvOS0tweGgBPI4ti1fKMtmlZdsn/q2vQR
BQDmOcysxc1u5gHj+0AN3rHU978zwL9lLaWnwQIFZPYz64jdndsYN31tYs4z1qrcsyxLWbts6dRuBXENKazERYxexsgOthnj7E/K41H4ozp6HegNvfh7Vzdg2iCebxQy78wi27/8kEs/cO6QeP8a+fwzvV4vFb25wPffJhjApcmk0PpxGSPjsk1oYHX9/gSmnY78asnaqw9+v/mubpJsGnX7F1emPg+ouw8wROn0f4IVTqdkaB6yPPXkc7Ljy8b49pfx8uXICdHeuYz3PrYVhbg+EQVfPhwgWEUhjfR/7Mz2He/q4tx6Qp8j/8T9Ff+l8Qn/tF2H4KXoDJZka/Rt16JHqHdmrFzIBXDieoTms+1OejGr24R1KmM7lczsbFunNXu21Zc+f963ERMcxGuNKl7beIy4RM21GyWZnZQTSznvpC5zOG6OAql6eTLcvidY4oJd5sqM4wG6GNBW5PemQw76MPVUAjaCCQuNJhnI/pJT1r7nNCXOnOavnWcb8QLhCogGk+nvsH4jyi5bd55+At6m6djdEG/9sPvs5aq8Ekywgdn9+8/2XO1BdZP+zRCqbc2d3nr33uE6xUq7hK8bX1hxweDFnoNCnLku7hCNUJGfTHTKYxrutwfnWRd+8+BuA737vD+PqUIiuot2p86Xe+wYXTK7x56yoAxhh+9bmf5qtPvssrKzfRpmS5skwv7dJwm1TdBqcrgmE6RArFKBtRmQ0FGqRD0jLFkQ41t8okn+Irn6VwyU70c3JOV0/zcLROUtpJhC8vvsw7B+9wo32DdMbQj5IFX/m40mWc2XKWHUtsiIqYulefqyof5Xjn6cjWqyd2nKoQUPftLIFpUuA5ing2TMV3JEoIbm8N6dQDfvJahzt7EXFWUPN9xnGO7ypGSckozmlXQrLCcGutRt1X/Kv3DzjVCpgkBa6yzvyLrZA/2LATONPCtoA5UmJCl2FkpePlhk8jUFxcCPjGwyG/++4uzarHpZU607Sg1HB2IWAQl7x4qkLVVRyMLYu3znQ7r+IPn/QojeH29pQf3u9aeXqY0K777AwTeqOER5sDOp2Q3/7BHT75S59hpRWy3PD5J//XD9Fa44c+1bqdFkmesHH3KVJJiumEGy8+z+1vvw+6pHv3Lt3tDkRDqC2w/84PqF2+ge87xHHB6lKVX3jzLN9dP+DyaoPuNCdwJZujCGOgFji8sbrA/X17zz3qTgk8RW+ccjBK2D+MqFTsGowTOwnw6modJQW+o/j7n7zI//7OM8azWQF/981z/PpbW/wnb5zlyTjCnWGv50pW2yFCwE4/wncVjrTlrijJaVTcD71/PvQON73usYlOCHR31kMdhrNRqxoTzcbWnrlo64izATeksa2jG2Pd9kdT18bj43qxUnDxGmZzY27UkxXP/tt1j+fbj0ZQryMuXYU0JX+waQ+w2bQqQ1iBevN4Yl6rdTz7vtFCvPIm8gv/PvIzfxl561MQVO0kO2HrgUgHU2SYO98DoxGXLtnhO+cv2vJBuwNPN2wL32SM2XpmvQWvvjFrLVzAv3kRFhdRz13j5ReWSQvNBw8HCCU5PEzIuiN6Gz00hkfjmGdvPWUrTslyze5exHhnZFvyHm+AH8Kob/vz776L2VrHbN7FDLqY4QEoB7l2BfGXfskaIdMMnj2b97vrND8e/zudYrLCjtk9PLTmyOc/bhOUx4+g2cT80e/OZgw8b2cgfPW37PU5mkLY79sSiZRW3h8OUdXArn/5o+Wij0LkOqfQOVmZY7Cz14/Gox6NiE3LjKy00+pyY2d9g60MJ0Uy78kWQmAw81awI9OXHXc7xZEKbUoc6cwG21gDYFxEdriN8vCUndTXTwdUnQpVtzYH+YpTmbNOa8IT8xntbb/N5eYVztUushqu4Ssr0xcmp9AF0Wwm/w8PblN1KvzijRc432zyyzdep5+MWAhrDFL7eyGUlCRxSi+OudE5z9ZoxCtnT3P98hlurq1wfW2Fc2tLlFnBe+9vUA
l9xtOIb//gLr3BrKtle8rDxzswzmg36zjKoT+aoI3hvf0dqm7I3cOHfO78J/nuzns8HW/zzWd/zNZki83JI7QuqHstXuy8PF/bXOcYY+vxR4mXO3PLH63J0e8LaPsLGGMYZxNc6fJ08tSOQHWq+CpgffhgNqfAmffRG+xgpNLYSYVHEwaPBgx9lCNKC4SwDNpRgu1+RG+c4juK3X5EqTWHk5QoLegElr0raafOTTPNwSghzUoOJxkLdR8BDKYZp5oBSWHr55faIV9b7wN2jK7nWHhwlWSa25YuY6DmK55ftWaxrd6U0HNYrFuQbwQOSoh5n30j9GZDfByWaw4vrtT4wvVFPnV2keeXmjRC21ufac0wzdmeRuxPch71UrQ2rCxVeenqIp969TSlNjzdn+AoydJSBaUklXqF3d0xoyjjD3+4w9KpNpevr3H67AJh6KIcBfUOuigoixJ6T7n9h2+B60G1CekURj1QDn5nEcqcSffQjpPO7RCi97eGvHapwwdbA9b3pvzhgz639yY8HUdoY1ise/xHr5xmOmu3O1qzWugShi6r7ZC6L5km9vvmdNNjZ5RRaE2z4jKcZtzdnbLaCvj21sC263kO5+tV/vFbW2SFoeo79vpNc9o1W44utGZvlBH6Du3qh3eOfHgqe8SiZ/3UcnHBgndumaBoNizjA8TSGczo0IK5lHZO/NEvYYHjWeq1mv2Z54grNzDvvWXBQwhEs2l74meDXszO1qz+P/vFLt1du309m3l/NNteKpjM+swXF62icOMW4vmXER//WeS11xFL5+zrgoodgOK4mOEB+ul99Ht/BNMRdFZse+D1F20NGyzLv/MevPC6NRjWG8jnbyKev4V57x1IEqKvfNsmO3t76PvryNDl5vOLrC1WMKXmbnfK9g+3ydIST9jsdTLN+YkXlilyza1ffI7mFeszIAgwd+9gHj2wLYuOC5MR5vZbtl6/sAoIcFzkpZcQP/mziOvX7JrO5tjLtdlsgzjGTKbIpY5duzxH3HwBBl17PcoS+Su/RvH9txGf/xXM1gPE0mkr7zuOVW0O+1Zp6Q2s0382TthoDb2eHYn8EQ47EtW2bUkhqbv1+WAZRzg40p2Z5BThDGilsMBbaGuWs0No/j/q3jTIrvS87/u9Zz/n7rf3BY0GMMAMBrMPh8N1OBRJSdRCUZEsyYkTR0pSVlWqnMpS5YorlUzsVOx8SOJyPiSxHcuOEzuRRIW0JJoiRYocLsOZAWbBvjaA3m/ffTv7OW8+vKcvrCqRKeuDa3KruoAGum9f3Htwn/d5nv//988RaHiGWyjdPSy9CHIJWmqfjMAzvJla/jjIxtZtSqaHIVTwyrGa/1hNrhd0PD/1EUIU3nqdqlWdCQSb9hxls4qpWeiaUaB6Te4M7/BO5x2+d/BGsc/PuNS6xrnGKZpujSRP2Kyt8ebeNp9YewlT16nbDj9x/iy/+Pjz/P6dd6nZNt+7cZf9dp+d0Yh3726TZhkXLpxibq5Gmmbk45jxNGA0mYKhQcMmThJOPbXBaDLloy9f4PTmKp5psuB5fP32Td45fMg3t99QVqNowjfuX8fWbU5XHkNCoYQvcbZ+jrqleAzHoUPHu3gFBsqoWhWiLCLMIipWWXnnhTpYPTv/PHvTPR6rncVPfZziezUUIe+YSAjKZeEaLkmmRIHjZIyffrB1JsAMtepaBpWiOGa5pDeJODFfnsFo0kyyWS+xM5kWsBVJkucz8pqU4Fk6nqPup2Rr6AI+vF7ldy630BAIASeb9uxnrdYVoKfq6JyaU+Pi9jQhzXOyTHHyq45Oliu4ztVDn2mUzuhwFVvZx05Vy5xplNlslnAtFbAzjVLCJOPd1pD3j0a8+VBNXSZhwl5vymMrVXRdwzJ1Pv5Yk+k05iNnmjiOyfpCmcfPLXDh7Dz7LTUGHw197t7cYzyOaLcGxNMppXoVu+RiuzZ4NbRqUzWihgm1ZbBLlNY2iAZDHv/sp1k8tY7rGliWzp
2tHjv7Iy5tdUlTxQy4uzOg5uislFyyXBImqmD/+vNrfPyx5oxHZmgap5erVByT7jRVh7CKzd4wJkozXjxZI8slC1WHTEp+88Mb/OBul58/P4+fZDiGRpLmOIayHo4DdRA+Gqr3kKpjqJ8fZ9zcHdCfxj/y+vnxo/tjEptpquJ7dKQKyLG/GlSRXlxElOrIa5cecdiPjuD06UckteFQcdGlVOr7U48pmtxkonzzq+uwXOyQSyXE2SfVnl0I2HuInCQzat2Ms1+Ac+SghzAtxa1fOwm1OUR1DtFYVMXddsEoTjx5CtEE2dmD2jwMclXco0B1s7vbSlkfBIgTp9Be/Anki6+qMfdnfk7dT5YgyjXk3g7iw5/AA7pfeh3DEFg1F73iYs5VKE0iSj/5Ms6Vf87mv/tZxPwip05sqp83nYDtUHn6JeTbr2O7niL22bYSHkqJ7HWUz79cQTz3snrsYaGmd5TVTnv2VeTKafKv/GM1LQlDtSopwntEpYw4cUKN430f8ZHPqQ7e8yCOyX/3H6LPNxDLm8jeAfnv/SPFTADodBAL84iFBfRKRU1mAFxXrWEMAzn9YL9JuoZDVrzRO4bDOB7j6PafoqEdB6NYmk0/6s3+fBgPWXQXmCSTWXhL02moBKo8pmyWlTq8wNMuekvFiHiMrdssukuM4iGpls5Y87pQqntLsxAIgizE1AyEVAXfM0pIcuadBabphIalOldXL2FqilqofPED7g7vcKp6ilE8Ymuwz43ONk8vnuZoOuHvv/d15jyPD6+c4SfWPs2nVl/B0iyaLzTwCqCOIUz2Jx1eXX+Ziv1dvnrxCnfu7WKaBgvNGuWyR3kS8G9/6uP8zb3f46//yhdZLs2zUTnB3nSPbjCk4VR5ceF5vrX7Oo5hc73zgIZTwtJ1TF1nGIbMeR4LXpPffP7zxHlMKzjA1CzmnHls3WXeWaJi1nivcwk/DdCEoGJWyAoevqmZ1IqDgJSSk+VNHk4eYOs2QepztXeZqlWlbJYZJ2Nud29RsxVhsBf2KZslTM3EM1yFOtZdAqlcOLrQCPPoX+MV+ee7LVSdGSEtzZXQq15S3bImIEsllqEU73XP5F4vxDZ08lxyeWfEueUK99uq+B8OAp47USWXiur25JJLN1AgorWGw7kFh5WSy7X9CUtVm4+cqBKkGff7IbeOAlZrFt1pSpZJqiULTajUtmmczcbwdc/iyZUSqxULS9NYKbm4pk6jZM0mBd1JzMORzw8eDvnIRpXrR1MmkQrQWaq71Es2793r0Kg6nF2p8rPnlvjCE8sYmuAj6zXmPVsJETXB//LWNq+eqvO1210u3+ty5dIWeRzRWFnELTn0jvr8xS8+z//5pZz/+jc/zlLZYqNSYhDF7E9CypbOk/M1vrvTwdAE7+1NcS2dzigiR+KHaid+sunw0xfmEQJ2Jz4LmU3Ts9A1wUrD5RdLK7QnCfuDAD9KeaxeYhRmmLqg5Jg8s17BENCdxnzqxBzfediZJej9ne8/UD+jVuJwEvJ/X+3wxGqFHMnBMGS16XFmwWO17rDbV1bLkm1Q8ZQ2I0p+9HT1xxf6MHyESD06Un73PH+0t/c89XW2/SjUprBrkaaIk5tKODYeK+W3W4LJSI3V/QlyRyliMU1YWkO+80M1KTh7Xo3idx8gL72lvmZ9fRbQIppNBcE59o835mBFdezaySfU10sJhgVZApoBSAXyyXPy1kPwxzDqIVY2kafPQftQceefeg7KNcT6WcWiN21lc9NNRHMFbG8mBNT+wn+AqDSQG0+w8FO/xOQ/+2sIy8D+uZ9C9jvM/yd/HRlO+Ylf/vcR9SXkoIWoL8FLehGyo6uP8x9VgkdQv8ahesxpBKatwnImA8gzMB1kMEbkOSCAFLF4AvH4BajV1Y6/UkMOeqow6zryqGAhmCYymKhDS5qi/cpvkP/ObyE+9grywTVorijRpec9en09D5lls8OeKMA/Uqj0v2z6wX6TzGSuEutkSlDAUo
591bpQoTNxrhC4WrF71wpC2/Ho/HiUnGbJLM1OFyo5TuFUcwzNRCC4P9oiRzLvzs9WBb1QHR48swQoC9myt1Qw9A1ymeGZDp5RYhD1cY1SkbjnqhjaPJsBX8IsIMljLrUvYes294ZbXGheoB8OmCQ+URrz+dMvU7cbnK6epmrVMYRJnCuu/rnaeUzNwhAGQmj8h8/8e5TNGheaT/FXn0v4tS/9F9iOxV958bMcTjt8/gufIyfnr/yt38AzyoziPlWrwctCV90fAiE0zlSfQJKjPa4Xh58QU7NIpYKGhJnP1uguju5Qs+pM0glxFpHmKZZu4+oeJyubDOI+YarWKwYmfjpFFxqDaKBWIeT0o96MXnih+RSXu5d5rHaGYTzAEKrRyPJslhdg6RamZiornlSipZJZIixEfHH2ozuhD8rNKZLQojRj5CdsLJSoOgb7g1CNvw2NPJeFsh2OJkXaXdEFWoagUbYI40wp79O8CJcR+EnOWw9GbLcnzFUsSqbBl6+3mYQpz66UyCV89/6Ine4UU9eYL5kslk2iNGeloWAthi7IUWPpmq1zOE54rOFi6zq6UOErtql2ymGikvHCJOd33jtksebw/uGEF1YqHIwUMnccpvz8MwtsVE7SdC2aBXBm6CfommC16lIvWYW4UPAffWyTqmvw9HIN8akz/OdLFbrjkF94fgXP1PjExjwA//EnNjF1DT/OaJZMJCVe0o6TFOFXG+tkueSXn9bIpSRJlW8dlIbAj1Iu7fdJc8l62WMcJySZZBKm2KZG3TP5pfNLfHe3RxDnrNRMbh+FdCYxuhBsdUJcUyucIzFhmqMJ+I0X1/kn7+/zhacWuN0bY2ga7VFI1SlhGhqebeBZusLraoKFikXdNShZOkdDhFwR0wAAIABJREFUnZAMP0p/5PXz/826P+7mm83CY16gT5eWFDwnyxCNOWXFK9TwMggQp08jw2C245dxqrrlYw9+p/Moyc6yYHtLfd5swvIJeP9NpD+dKcVFowkXXoKL34HNszC3rIqvZYNXQdgectRBjvuqeOYZYqUJuKp4pimkMXJwBO1dFTpz+xZ87mdgYb1wDpxA1BfALRCZUiqRhqYjO/vI1kO0pz+pCq5hIZY2QTcQlgv1Rcr/699DOGV1wLAcMJR9C5krzsD8OugFtCZLFWnvuOCjqeIuc7A81H6hqn41LHXYEIo/L2xXHbjSWL2EMkX70GfJD+7DD/+Y/I0fIE6sk+8fon/oReRRC3HyNMwtoi1vwhd+g/zW28hb7ykh47nnkHfeh4c3H4kYjw8HjhqDyjyHdlupz4/dFFmGOf+jE5M+CDcVaqLCa47V3kEakMkcXTeKwq7hGg5BOlXFHcEkDVhw5mdBNglKse+nfmHBiwizAv5SUNgmybiw7RnYukM37JBJFXATpD51q4YmNMaJiWd4CBTi9ngMn8uM/nHOejzE0R3WShtIKHzlEcNkwP3RFqNoyt7kPt+5d49ffWaPZ+cvMEmmOIbDhcZT2LpDUjgCdv0HANwf3edi6wq/eu4XyWRK056jbs0BUDbU6/jlX/nvCwiNjqXbaEKfuWjSPKFmNdGFTs4jVLBAcbglFM+fhqt7hdffI5fqIPRk42kVu4ugYc+r1Lw8nI3hl91VDM3g/ug+vahHw64TZzGL3iJ+4lO36xiawYnSKQzNoBse0Y96GJqOo7uMkiGTbELNrimRYho8OoygxtjHHX5axOIamqmihz/gt9YgwDZ1GiWblYaLVXjS/Sil6irqna4J1usWu72AKMkxdEGU5lxYq6o8L1MniFRB2BtEbDZdWuOEh92Q7jhipekRp5KDcUSS5ZxaKLFScvneTp9myaDmVtnuBazVLFYrii3hWSp45mTdZrVqUrMt6rZJ1x/SCWKCJGez5rFSd4gKDvwkTGlNIq52FGL3oB/wvZ0B22fm+PS5BrfbAecWKnzy5EIxLcg4GkbsTwIGUczRNOH12z1+/eV1yqZB3bFYazikuc
SzDdIs57/5/BOUi0OPZagDRprJInJWTUHU9Qp+Ef17/JFJoGDaG7pKBdSEQBdQsg1ePb3wp5C8aa6Id1LCwE84uVDCMXX+r6sH/M7FA55cr3M0DPjMkwvc74Y8v1amZpk8u17nhRMN9gch19pDyrbO2UaFu/0xrWnEueUyWS65e+RTcgzqrk5SCDF3e2qS6sf5LOLX/DGR4eJ4d/Vn3ZLf/LwEEI6tCnu3q3bg3a4Se0mpMus/9Rnkg7tqDH90pIRqzz6PHPZV8e50VMjK2qr6XNfV2L0A7bC6quxa9brq2E89pvYn46HKc48iqDURbhnZ2UfMr6rIV01XBVPTEYaJTBNVyHOJsGzEyhnVmeaZItoNWqoAJ7HSE5iWKvrNFfBHaOdeVH8fBcjePvLmOwqUc3iIePZFqNbhwR145iPQ2UeceUY9FqE9mmYITXXhs+dVPrIbJhGYjvozoalpQ5HChczV44wj9RiK3HPy9NHhQDMe3ScUgTzBo+lFMCZ/908UMGh/GzmZID75k4qsJzTwqn8q0EZ2diAKEG6ZvLOnRHjwKEvANBGeh/R9JXI89syXy+o1mZuDyQTjb/6TDyws/J/d/cfymCRXscpEWUTFrBRENJWMFmURJysbhQJcoxv28AyXhtPAT9SId5yMibMY1/AKuIsSj00Lq9eyt6T0AEXRbzrNYr+vRGSa0JBSHRTMWchLjlFoASTHgTgOo3hIO2izWd1kvbQJgJ9O8NMpD8b30YTOgjPPIB4w7yxwb3gXrYhd/fjyKxjCoBd1OPD3+d7+m/xg9x59P+DVU2c4UV3mD+68w7/z1Ge43r3Lh5ae5kOLL6MVfnRH92aWs+NbLh+NBDNyVdaFKuiyAPrkMicnJ5c5fjrGMyozT/txGI+uGegcI0IfuTWCbIoEbE2J7Q6DPQC6YYdpMuVC82nqVhNAHRSEmIn2wswnykIMzaQdtOiEnSLKVtELkyzGNVwkknGsxIjHDIVj7n2Sp3xu/Wc/sNcwwBf/wUV5XNRNQ2PkJ5xbrdIahjimItMFccYvP7vE9x4OSTLJQV9lun/qXJPDseo877enBdDFJYozlXyW5oxD1fk/faLG0Tim7ppM44xXTtXQheD+IJgV9HGc0Z4k1Bwdx9TIJcy5JmGacTRNOFV38QwDUxc8GPqcbZQ5NV8iTnM645hpkvJua0jPT5krmfT9lE9s1Lm4PyJMczYaFj9zboU0l/QmMYeTgN+/2eHtG0cAzDddDF1j73DMFz+6we2jKecWS/zShRVMXRAlOSXHIEpUrvuxvkETYiaWU8E0Ksb2+DmwTY0gzjAK3O4oSFkswoKEUDoJ21T/Zl0T5BKcotvPpNqVS8A21HNyvz1ld+JzuxPQnib8+vNrrDfVRNHU1aHiOABnFKgJhyagM475f24eEiZKWzEuhJEVWyuAQymdUYRr67OgmwsnG0RJxm/9xWf+zOv4x/vov/7br4kzZ2aYVnQdceas8l47jgLZ1OuIJ56Bgx1IEvJOr0i4K3C1/b4S2tkF5OZfPjAcC/uKMT+DvtrPS4qOMYelNcT8KsJykVmCWD2jrGb+CDkZIHKJHHXVAaA6h1abR3hlNSLXNTXKjqbkty6qUb1XQXYPYfuOGtWvnobrlxCnzyNKdUhCZDBC7txRSvtTj5P98IcILYeHWyRvv4+WjJS//a3XoVlHVJqq4CYRMpyqLh6p7HCarjrvJC5G/kWHnyVFR69B7KvzwLCt4mwFj7r74pCioqQydbAB9ed5qg4ORsGxtz3E0oZaRzSa6B/7acT6ObVuMIwiqyBUP69YSYhyXf2+Ng+OiUgiRRdME/XaySICdzqFtbXZIUCcfXymodBe+cIH1oN8s3/1tQV3gUymRJnqzsuWUskf29ZMzWDJW2aSjEnzjFE8mmWVpwW9ThRivpJZwtRVPrqpmcWO3SywtBpxFuOZHgiYxBMkCsgyjIckeUI77FC1qliaxfZkh340IJc5vajPUdBmwZ
3HM0usl9Zp2vMIBJN0iJ9Oef3guyR5QtNu0vIPuXR0haOgxVNzT/L1h6/z0tLzNOwmUR4ySobcGtziWuc+f/mpn+VrN99mo1nnanuXS9e3OGJAJlO+vX0V0ww4UT6Bn06ZpGNa/j5ls0wiE4ZxT4kUZUaYB0XOfWGxJON46OmnE3RhcGd4g5JZmokfc7LZNOLYDpjI5JH6vUgONIoUPhXqU8HVPRzd5nT1LK5RKuYsopiqTPDTCY7uzlYmQgiqZq3Ywyfomk6UxUUUsD4DILmGi4YgJ6NpN5HFoflU9ewH9hoG+N33Dl57ZqNOhqA9DNW+d63M4VDFzCaZpOKavLIxx92+T5DkdMYhcZpTdk3SHPYGwQyxahk6G02Hhx2f1abHYKrSFNMcTjZdru+PubBaZhxndIOENJfcak0Zxzm2IRgEGWfnPdbKLjujkK1eiF0gcHeGESdqigz3WLPMYsUhSHImYcowSrh0OMLWBatViyiV3DicEGQ5mw2bi9sjLiyXWSo5tIYhvTDmrb0RNw9G/MzzK/zgSovF+RKTIOHqO1vsjTMwBLf2Ruz6CU/MlZjEGcMg4UZ3RFlXgr/9YYhr6oRJTncSz8SMahyfYeiqxT8axWia4O29HlXbRJ8RKSVBnBMl2Yy/HyYZaS4ZhylBnFF2TLTiAGDpgqprslJxOVFxePXMHM2ShZTqvsIk53AYcThQXIHjQ0iSqgPIU4sVTF1NTKZxTmsYoesaEiXKW28qKNLAj/ncUwvF9EHj808s/JnX8Y8t9Pm176u/tCwl9KpWVUpap63e+IXyVWvPfRT5zhtqv24aCMtSLPu2ovkcd/9iRe2AZbenGstSSd13pQI726qAnH9aFZClVZhbAN2A0EeU64gsVcK64y73cFt54jUNsXoaYZiAVLYJ3YA4QEY+8vL3YW4ZUW2qHHjLRqxsgj9SCXXVumLZexXk1R/A9YsQTNXK4Yffgywjb3fRsoRs6JPttdBrKv5WhFPkRAXRyN4BHDxUfvM0VgcJ3ShmRJqC8XT3EXmmunvdeDTVSCLl69cNVcwlqtjGoXIIjLrqvo4RvYW7QQkj46LwZ+rn2K7C+VpOsfMP1MFo1FWFPpyq5yoK1N9pOvn9q2hnnkGcfVZ17MMOdDuPkLphqMb4c3PqWrAK+6Pvo336Fz+wb5L7/s5rmUxxDZcwiyiZigPvp37RgUoEggV3kZbfUsVGN7B0m5pdnUFrcilBKEyunwazyFRNaJTNErKYDOTk1OwaQZHBfoyrHUajgseu4RgOk2TCvDvH1e5NcjIqVoXHamdmO+YFZwkhBL2oQzs84qsPvsGSN8+8O8eit4Rnesy5De4PtplzG8y79Zmw75/f/0O+fOfbeKZJ3S7zT6+/ztQP6SYRYZoy8UPa3SG5pWMbBqkMeDC6TyIjWn6LTthllKhMbz+dommCpFhh+OmEm4NrRHnAOBlh6XZxsIVh3KNpz+MYHpMiGGgQ9xjGfWzN5ig8xNB0TM0iTH11nhUqxjeTGTl5YX1T+31Hdx+tLbKQ3ekDDoM9ZPF1oCYdYeYjhMa/2P5DTlVPs1raoGrViPOQUTwqwoTygoxnUbEqRHmMZ3gEaUiSxx94YM7FvfFrACVb52AQMl91qDg6nUlCbxKhCTWi/uy5eb51t0ecqiAU19J5Yb3C/a7irFddJeg0dEF3krDXmWKYGmXboOQY1DyThz01xVqpOUjU21l7kmIVbPy6Y+BaGjXHYHsYUHcN9kcJDddgzjN5bqmGreuUTIPFmkOUZOwNA272xvzvb++RSTUe36g75EhWazZvPxiyXLPJpGCjbpNmkv/h9S3+5HaHqw8GPL5e54d3OhzuDwiiDNPU8P2U7lEfzbQoly1SKXlzZ0CmSW52ppRsjTvdKQgI0gxXN4iSnFzCMEr4ys0jwiwhSnOOh1aaEIyChEXPwTV1DicRlq5xr68CeCSwPw5IM0nZMdnu+ziGjmVoSGBarEamkbIy2qYCXE
TbCkhIWM6lkSWJvYGIq02eOyzCMOZLF6bpTHxsLyJtqCiyzJtn/g5ivOD3f/eWN/ZIVTKoSYKWEoQeWVeJRg6SKhP2JujTBbpbbeIICheWxGi98HxsP+yJPPPjY5F5Ph5Dv4904ZIYJ+fzghVvpIQILY7Fn9ATBSoMwHVxbj9BViXCvgV+gF/voU4VUApZtNc+R/TwCe5hD7s5ZuZXvkSwV0MtplFSGlI2g3//Ge5+h9qDGptPWhzv9JiyeqTOzItgnt1dwrFD0BwR+yFEMebPfoXg3dvEXoC+Oot85jSSqpJMxkiui/+9H6JUisK7L8swGePc2yD15mvIlSLUj5HyeXFp0XVwHCgU2PhXbzP97S+Lz4WhGIHnC0hRiPfjD1ELaZQTK9DpIOcFBMF6eoy5Oo2332bYGNO7v0thqUA4cmA0RFpZFq/LK0K30OvCnQ/AGiLphsD9hgHJ/h5SKiU4+BeuwLBHsr8NMoKFHwUiaU83oTpL8vCO4PBns6gvv0Ty4BMhHAwCkRZYqyF/4x98Zg/JDxrv33JC97l3PGYqXSBMItJqCj8OkBACvayepj5pEycJOUN4f+Mk5l7rCcfjES3LYrvVwU1ijoYjiqbJbCZLw5pwsTrLcl4om/ve4LklL6HtdtgeHJEAmqbyaOMARVNIZ1J0LZudoyYXFue4vDDHa6fO8d2792l1BkRBxD/+2ld579kGmayA8NhxxPuPNun1Rmzv1jiqd2i0+shZg+pUkZQh8+CojqapNOpd4jhGVRX+3rWrvPVEZLbP5XO8tLBMlES4oUfTafGdj+9yeXGOtVKRruPQdcY82NjnN199DeSYp90jKqkUs5k8XhQSxDHlVIp/8Uc/4n/8xV9EIsYJAzZ7PcopEzSFP//4HqViji+cOcG723vMlgvEScLIcigWsuwcNul2h2weNalU8rS6Q57WGpyYFvjd+VweWYKh5/Kj3Q0Oxy1kOWYuO4WuaLx7tIciS6wVp/j585/DiSw2ekdY4Yj5zKyIEyYiq+XI63k6bgc/9smoaWbTMwyDEfBXaYUKdmh/5ln3/+yd/VuOE5JKaRimTi5nEIQxhq5A8tyt7IaszOXY2esRRRGplE7KVJnKGzw5HtHpO3heyM5eH5AYWz6VYgrbC5lMfG6cn2GlZOCFMW07QFdk4gRqo4DHBwMcJ0TXFWrHA4qlDIVihtHIo9+zuHltgbWVMmbW4PYHO8RRDJLMz795np/8+CnLa1XiOGH7YMDG02OskcVof1eki0oy1ROryIqCltE5ro3RdZXafgszbWKmTa5enOHe3QPmFssszOSollJomvCQd+yQO3cPqM7mWZrLcdSakEgSD+7v84UXV2j2bJpDl7mpDClTRMD6QYxpqrz9F5/wq79wg87IJYwSWh0Lw1SpzhZ49Mkh5UqWhaUS27t9pqczmKZGImukUhq7220mwwnDgYOiafR6NkeHfcyUAYrEX4UiekHMw/U2zaGLYSgkieBY1JoTwjBmbUFcTGIJehOP5iQkY6gYqsw48CnoOiuFNJ80x0zcEENT+JWX5nl3q0/a0IjihHxaozl0+eW/E+v+93/3lqYrxJaHnjFIghAlZyJn0sjFPHI+R9joETs+wdChtFRELWWQXnhJYGR9T3i7JQnJ88R+OJMRnXy7JUbMX/9FAamJY3DGItQmnRXfQLtO9OEdYedSJGRDRV2ZJx5O0E8twtQU/qMtvDtPMK6cRlFiMm++AkGAdukckqYQ7DU4/Mk67sCBOEHTFOxJiCZJ+LZP5Zufh6Mj4omN3xQ3fyVrYKzO4N15TGx7aJUs8uyMWBu8+CqS60A+j3Ji9VOFfDKeCNCM5aCEruh2T50VlwJVFWlw23u497eZ/YdfxX/vLrIzFlOGxQWhYZAk1OkSUbuPPFOF0Yiw0UM+sYJ57SzevXXMtRlk2yWV0QT847ALgKrEYmrSaooLhOuKC4aqIuk6yYP7kEkjVadJNtaF3752iDQzB70u0s
Iy7DyFbhPiAKkyRzLqQX2fuCkS7jjYw9tpoOZMYssR4JwoQv7mr31mD8n3m+/dipMEPw4omTlUWSGrpdEVwT3P6Vk6To8gDhj7NinNoGTmKRlF3jm6T0o16HsOaU0jlsBUVbEHHww5Ho9J6zq/fOarmIqJHdoM/CF9b0DZEF3jem+PD9Z3qBRzOEFAOpNirVJmv97mK5fPYqoqf3bnIfcPDnj14ikwNX795VcYeBPOzFbJmjoPNvZ5unmA5wekTIMEEWsQDF2GtsNvvPYKHxzsEycJnfaAVNpAVVVK5Ry317fJ5TPouspaqcTOoMffO/UFuu6Aslng62eu03Y67AwGNMcT3CgkiCIOrCGaonBjbhVZkv4Kkc/t/UMOWl3+yzdf57uPH1C3xww9jxdm56hPxrQsi/OLsxx3B1RzWQaBx3G9w4mZKc7Oz/DssM7SfBXX9cnl0mQyKQ4PmmSzKew4wtA09gYDptJpuo5DlMQYqkrRMHj/eI+srnBleondfoeh77LVO+Z0eZ7NXotTpTn2R0c86W0z8ges5JYIk5CRP2ISWJTNEh23yziYYCg6VmiLXHpJ+swX+t97b/+WLMt4XsjMTJbewOHcchEJyKd1AWc5GmKmNPoDl6XlIrmcwcWlIluha7EgAAAgAElEQVSNMW4QkUnp+EHMVDUjsu0VmWbLYjLxWZzL8/XzZdwwQVMlDvoe9VHAYlFHkiTW62P2dtqYKYNhf4Jh6mQyGv2ezdXLc7R6Nh/d3mVihVy6sois6fzyG6c56FjML1UwTZWH9/aYjERyoyRLRLKOks2T1LeQpxZ48cYSu/sD4jhh0B2TLWSxxzanz87wySdHzC9VSKVUpktp1vf6fPXKLJuNMRlD5cXL83RGHq2OTa/nUGuM8RwPDxnDUDkxlydjqgytgFxK4+B4yHjs8zNvXuDd+zU8T3DxczkDzwsZjTzyxQy9no2iyARBRK9rUy6nKRZNms0J2ZyJ74fIikx1Js/RbhPd0ClXMvhBhGUJcWJ/4DKZiBRBSZFpd2wqRZNyMYWkSnQHDmMvJGdqNDoW+axBylD55HDEwItZLZk4YcTxxKc1dDm3UODu/pBG1yKb1vGC6HkyZ/x3C7Xx/vnv3JIkiSSMkVMaiR+hV3Mi4ESWQdOIe0Nixyc9V8A4swTz80iGIQpJJotkmkjpNGSzkM8T7R3ibDbRF6YEVS2VEt184wB2N8AQalHadZLdLeJWD3llEdnUkRYXiA+OcA+6dB4cYDhjrKM++S9eActCXphDWj0Fm+uET7cIa23klE7QsxgMPFwv4lFjTF5TuPYbL1P+0gtIa6eR0mmk8RCJBCWfQjY0oqGNfvMS2s1ryFeuCSeAaSKZJsn6U/x3PsT+y9tItoX6ysvi79Np1GuXod8nqjWFDc61cD54hDZdRHJs9FNLYl3h2siGjjRVwf/4Mcq5U3gfP0WKQ5RiVnx93yKJYxRdYfL925irU0iqQjiwIE4wFkpMDvqElkdqbUY4GHSd5OiYuNlFVmXxPh0die57NBLQnkzm+dQkFNoBVUWKYzhzWZzmhQqYaaGXqO8h6eKiIi0to0wGSLkcUX+MnEuLFL43v/2ZPSTfqb99S0YiiEI0RcWLAgxV/zTyVH6ecOYEHuVUEVM1KBp5Ok6PKAmZzUwRxMIrXkqlxBit1aXfHXFirsr12WWQBHb1SXeDB+09imYaWZZY721zr3FMlCRcX5jHSiKuzMzwuNGk0xlyZ30XO4kYjWxevnSKgetyc3GJxewM3995wk6nR3M0Jp026Q8mDMeWmBrsdgg0if/u177JtVMrrBbmyBgSTdtGliU0TSGVNgj8kNfOnuTGwioXpuZxQjFiL6ey/Gj/CT949IA/+ugDxsR87eRF/MRnuVDg8vwCTWvM0PV41qrT9xzefrROqZilPZpwcWkOK/DxkhhFllnM53nr/hPOLczwdO+YkeexMl1hu9Ol0x6iqipGSucH73/C3PwUqqIwGt
sYho6ua3Q7Q2RJolzOAxDEMRv1Ft3xhHTKIK1pbPR65HWd2mSCqYIfx3hhiBUEJIhXO7C4PnuBmUyFufQMuqJT0IuMgxE5LYcXuZiqiRu5AJ8GGUVJxJXKC5/ZZxjgf//Rzi1VlfGfB7V4XoRuqKQMlUJKeLSdMMZ2AsrlFL4foeuqQOZmdEpZ4c+WZAldU7Bsn6PDgShS1QzL01lAQlNl7h1OeHo4IJfWiBKJB4cjGm0LSZa5dGYKNaUzU81Qq48Z9kaiQ7d8zJRJdSaPaaqcWymhqjL3nrWpN8aMRh5hEOE260TDHpGkwfEzEiPPG//wG6SzJtOlFLEsPd+hgyzLlCoC5Xvx/AyrszlyaZ2xE5DP6qxNpXlyNOTDO/vcuX9AvpTh8xdn8CQ4t1ZmerbIcOTR7zv0Jx4TL2Rnt0e+YDIYuMzMZMX/F1Mo31MpjQf3DpidK7K33cL3IzJZg0Hfxp44xFGMrKjcv71JvpSnXE4xGnlohka/OyGOYzRdI5U2cN0Q3w/Z3+9h2wGGqWKaGq2WgB31Ry6GqdLrOwA0GhPSWZ3R2MOLYk7MZJgpGKR1hWJKYTptIikxkqLQHnm8cqJA0wrxw4gwSihkdDojj9948e8SavN//s4tJAidAFlTiIMIJWMS9SeE9R7q2iKSYxG0xyBB1BmifeXLMLcsuOuTEcnuFvS6n+7k5XIJfWEKqlWkl74kLF3duhjzG6YoPoWSIMNtbyIbquC+yzL2uw8JexbdxoSZS3PoyzOkZguicOm6IOwN+6AohM0ukeWxf68GCTRHHllTpZLScLwIrTtAmwxRp0si8laWxS68P0FbmiGxXeRinuTpU6T5BaR0Fun8VZJnDyGKUHIpzJ95A2XUg04bPI/goIGsIorfyhLKmVPEewcoGQNZU5Dm55GWlsXlR1Pxt4+RQp/YDfA3D56ni0HQHCApMkFrjD5bQFIVZFPF2W6in5hHTiK6O13MrE7uyiqMLLS8uCC5zw7RVudF8EzKFAV+elp0+KkUdLvidxyGhLvHyKW8wASn07CwCq1jcCykQllY/5oHwsoISHOLuO/dFSI8CWRNIbFslJ//jc/sIflu451bAH4cfppelzMy2IGDF/mUjRIdp4cVOMRJxMizOFlcpWKWKJtFalaTJ50mx+MxjdGYtK6xWCowUykwn8vxyvx10mqavdEhdatDXhekrMXcPHWrxUa3RTWXZX84ZGQ5vP9gg8APaXUGXH/hDCeqFZSMzlI+T0rTqI1HuJFN0TCoDYYMBxOebhxQyGWwHIdMOoWS0QijiFEQ0PNclktFjscDZFkmlzZxwxDT1PFcn1Iuw5N2g1cXL4AEry++yI8P79CYTFirVvgnX/g53tt7Ri+w6DoOm7UmpWyKnGFwbXaJs9VZ7h8dUi7nCeOYs9NVzlTmgYgoSVjfOabvCIb5070aYRCRShls7tXIZFO4jkcunyFIElRVoV7rsLYwjaQp1Otd8oUML51ZozkciwMyDNnfa3B6ZY7hRPiH8ymTalrsKkumyVa/z1KhgBuG7NRapNMmUZJQME3WCgu07S5dt0fByJPTckyed/BO5GAoJgfjY4BPEw2DOOSFqZuf2WcY4H99a/uWJEl4nhDQRVHMdDlNFCds1EYsTWVoDl0sKxB5XQlcPVFhtZJiNq+z07LZPR4xGntYdkA2azAznUVVZVbnclxbyJEA2x2H3sTD1BXShsZy2WSvYzO2fBRFZmT59HoOm89qIEnYI5srN08wPVsgmzNZnc9jexFHjQmZjIbliR30eDDBmTiCjKqZyJpGoqfBMOn1PVRdA0UiCGIcR/yM6bSOpikMBw4xEhM34NJykVxa59WTRf7obh1JklhbqXDj0gL3HzWw44R226LZsUlIWJrNcX6tjGGqrG+0ST9P/isUTIo5IWRrtCyePdin1RyhKAqHey2SOMFMm7RrXSrTBZp7NfKVEkgignc8mKAZBrqhYY1spmdLTE0XsCYeCeIi1j
xqk8lncCYOQRARxYJXEEUxqZSG54eUiyn8IKLfs4gTiWxWo5gzuLqYxw5iDroOxbRK0dQYByErJZO3HjdZrGT4eLNLnAhhnqrIHDXH/Dev/808iL+10POTf39LkiRkVSZxxd5aNjVkUyPojNGXp8GxCfsWqXOLaP/1byPNPMe6Ng7EDt4akxzXhGI+JQJSKJeRXv8mkmaSdGtCfT4eCVvX3JLwk++ui32+qpJMbCb3dpFkmXZtTL5g0NjpUZzNIr/xpsDeWhPodul872OczRp61qC52UXXFAA0JKYqKaIoQQKOujZrr50RNLo4ItrcJaj1SOKYoN7F/Jk3YDwSO3ZrjFSZIrl3G6lQEE4BwxCivEIB6ewFpJU15MAm2D5ETkKkKIKaiHuV4ojE8YTboN2i85232fnpBrHlYdUG6KpM7ASkr50gqPXQ54oQJRw/a1FYLOIdtIkdn9jxMVZnkNdWSCs+T3+0w8z5GYL6ANlUCXsT1EKa9rsbaKokmAeaBr0e4X4duVqGclmsUBRFBBIFAeFeDeXaVeEoiCPEjSMR8cBbj8REQJKIbt/Bb45Qc6ZAHq8uEPeHqH//Nz+zh+T2+NktN/TRFRU39AjjCFMVQTUdZ8BMukIYh893w9O8vvB5kiTGCi02+ruMfIuJ71Hv9JkrFZ6nn0nMZrP87ImvIEkSfW/AwBvjhC6KrHCyuIQd2uwNj2nZNk4QYNkurUaPbC5Ns9NnqpTnweMdsuUsv3ThBrIEe8M+Thjy44+fsF1vAdDvj0mShEw6RRCEzFZLgEQUxdTrXV67do6ZbIE4CXl8VKfZGRBFMYPemJ974Qojz0VXVZxwzIWp0/ybp3/J5eoKYeIz9jzmsgWm82muz57klfmzHLtdttpd2hOLrKmy1WuTTZlYns/YdjF1jZ1+hz955y6333vMwHUYDidEYUwcx1w4u0Kz3f9UEFhv9VhYqNKsd5Ek6PZHTE+XqKTTpAtpfvKju+RnSozHQg0+GEzI59M8frZPFMbouopp6vQch1p3QDZtcr4686nbwSYmjGN2jpq8uHKS1fwiYRI+5yFElM0yLaeJE7k4ocP++Bgv8sloaRGNm6lihfZnPr3uzza7tyRFENM8L0KSoFQwqeZNNg8GzD4PNBnbAWvzBb790jzLRQMnjPnkeMzQ9vGDiPHYo1rNMBy6KKrMyfkCXz1Twgpi+nZIGAuxlwRU8wZdK+SoYxHHCa4b0O1MmAwtNENj2BWrzvp+i1w5z5VTU1heSKdn4zgBT581GPRtAi/AtV2SJEHTNZAk8uU8fpiQeB5ev8fZq2uoskwurbG718N1fNp1EXa2tiYy5+M4IZc1OD2V4nff2uHMUpF6x6LVtri4VsaTJJams1w5UabWc3DdkKPaCM3UeLrRYX4hj+tGuG5IGMZ0ew6f3N2lfed9kjghRCUBQschW8qDBKqm0m30iZHQDA174qIbOpN2j6n5KVIpjWw+xcbdDSTdZDKaICsK/VafdDZNb+8QPwQjZZDKGIxGLlGUkMnoz4l8MaoqM+gLN02rOWZtqYipiylDWhfgnYyh8Kxtczz02WmMOeiJ6N5czsD3IxanMlhBxG+9/DejnP921v2f/t+3ZFMjaI2RTY3Q8dGKaZIgInEDtNNLSGGIVs4gnT0rxr6yBMOOyFs/OhD7+VxWdJaOI5CqF66BmRH7+FFfqNen54Q1LEnEq2NDu0XzrYcofkj2C1eo397BdcVDvvTSCsrqEnTaSPkCyZMndN7fpvqFs4x323RrY2w7xHZCVE0mDGOCMGZiCajCS792E2+3iaKDf/8Z2kKV0aNDcl+5gaZL0O9z9Me3SfbrSKMR6lxF+NLzBSEQlCTRCU/P4fz+H5LsbJOMLbbf2aN64yRRvY1EAmGEVMjjH3WQ4gg5m4bBhFxKQVVlMgUTkgTP8mk9PEYKIqLuhNjxqV5bREkZkCTYtQHZ8/PIs9OiSCcRwXEHnYjU+UXczQZyWmfv9gGOG7
K7O4DmkMJSgcRxUKbF90+rRdzpIWUz4nedyyGfWBGXmlYNlk+LsB9VRSpWoTJN8u6PhbpeSggaA2RDJXZ8tFPL4i3/2q98Zg/J9xrv3kpI8CIfO3Dxo4CSWWDgjdEVlcXsnICnkFAxy/S8Hk7ocjxp8klrj+PxSIztcxlSqkrXtplKp7k6cwpZkmnabZp2h/qkw4nSIgNvTBRH2KFLGEc8rdV5ur6PFCe8euUstx9uMrEcMmmTK1dOsVou0XPHaLLEvb1D9vYb/OJrN1g/arCz3xCpe5KEaegggWW7eH5AJmXyC2+8zPpxA5eQBwfHFAtZ+v0xr108jatA27Z4671PGE5s+nGAqnookkQ1U8SNXKqZHG27z1x2iv/j7R/ydFRjMLH58M4TXrt6jif1JhnToGc7LBXyHDa7OHFEJZOmN5wQKpBKGUxXiqiK4HE/3tjH9QLGQwtVUTh3bgVZlkVs6mGTM6eWODMtnA9RHFNr95GRuH52jd2jJrl8modPdnE8n4ntEocxyBKO47M0XSar62z1uhz0xNhZVxSKpsmN5SUyWopnvR3OlNY4njSRJJhJTzOfWWB7tI0qa2iKguU7AhTjWSxm50iSmMuVa5/ZZxjg392r30obKq2uTSaj0es55PKGYNs7AUvTOfwopjfyWJjKcNgXXflOx+Xxbo/hyPuUBqeqCv2+Qz5vcHImyySI6FkhrbHPYcdiumgycUMSJCxPgIWO62MaRx0KpRwnTlbZeXpIMuqSKpdZPr3A9HQGCehPfLbWG7iOz9nzczSO+0zqdRQzhaIpqLqKrMjEUYyqqaQLOW6+foGtrS6ptE6rY5PLm3TbQ+aXqyiKTL/vsLdRw/NjumOfphOSyxoYukIiwanFIsc9mxOzeb7/4y3sBJrNCQd37rFy4QSN5gTTVJ+H42i0GgPiRHDqG0+3wMxCuoBqGCRxQiJJeI1jnMEQL5RQdZWV0/MkiDjY7u4e06dWqVYzyLKE78d0u2NUXWV+aYqjjQMyhRyjQzENNUslZFnG9wKskUV1pgBI9HoO47FHNmswPZ1FkmB5sUg+pbPVnHCimqHvhDSGHrMFnQuVHN972iKfMciYKt2Bi2GohGHEynQORZH5pSuzf+Nz/B9G6QD2RgNJlnDHHtHY5fBI7Iatwx6Sqf614GthQYxkAl/knKezSJIEpkm8u4/1w49JdveI213R1fsu1HZFJ58tiCKfyojUuCQWNLbNDYLjDgDpl84x+ekD/CDCD2LmXlxBWZgRuNzZOZKtDZydNpYVYD3YZ+qFJRYuTLN6SgBgFq/MoagyzYHHyfNTLMxneP/f3MH44oskrTb6jYtE3QFGWid4uEEShuz8yT0GAx+jmhNRrWtnCdd3YWoGKZtDWj2JNLtAsr9D6huvo7/6IurKPOf/h3/Ag3/5Pl59gHvUo393n8mH64R9m8j2se9vE9nCD/toe8BHD5v85H6dd7a6dAcejaaN70cMOjbRxMM96ECcUPraDWRdFRem5/bDhVdPEjsBAP22xfbtQyaTgI2Bw0IlxU7LYvLRFkF3ArkcUbsPpRKxH4mpRBBAq4WUySFNzYho4H5brEAUhcQaQhiShJEQEw7EKkFOG0iqInb+uv6f7jT7z/DRmHTQZJW+OyJKYhqTyf/L3JvEWJaeZ3rPmac7nDvfiBtz5FiZWZU1kkVRIjVSanaju6VuwbANL7wQDHjntRdcGm0YhhuGVgK627DdULcFdavVLbJJkSJZZBWrilWZVZWZlRkZ83Bv3Hk+8zle/KHSxixYK9UPxCoQgYhz/3u///++931e4jS+KvoxcZYQJAGuUbgStMQUjSI53SJMEixN4/Fpm588eMrJZIIXhOhXxeLJ4DkDX3QFble3MRSdm+VNkiwhSiJ+sP+c6XSBqWvcub7B999/hKoqqKrCa/eu0cznuFaqUbFs3jk7YzCYsPB8/urxHq/c2eXmtXXqFZc0Tblza5M0TZnMFty9s0O1WuRf/8
VbvHltm/2LLl/e3aLXHzOdLXlwco6ta/z03Ud4fkil5lLL57hfv80PH++x4jTI6w7XSxvsuGvsjy/4zXu3+NWta7y8tsq3/ut/yp98920Wc49PD895vnfKzx49R1UVFEXm4/1TojgRBeNyzPF5j4OjNmfPLgimHksvIE1ThpMZk/Gcfm9MEER84yv3SZKEvK5jKAoTz+f+S9dRVIVFGHJxOeC9dx+Tphlhb8Hu1grdwYR+X3w5msazbp8t1yWJE7woIkpTjoYjKrZL0cyzlm9yNm/TcKqkWcYyXhKmIWESoisak2AmkgE1C0s1kCSJnPaL4z2/KOvgZIwiSywWIZ4XMxvPCKOEi84MRZEJkxRZlnjlRo3eVUBN2VKvDD8Ktq0xmfg8fXTO+fn0swCbiR/zaXvOaBmx6hrcWC3gGCqbNYGqlSR4+nzAbLIk9EOhPXncRjMNtGqT67dX2d10abgW42XI3rMuuqmzmC042O9z626L5q3r6KYOGWxfa+JNpiy7l+SKOXRT5yfff0y1muNwv0+j7nD0vEPgBURRgiRJXDw9JPY9DMtgd9Pl+kqB9352wNeuu5i6IAY2XIvHJyNeeXmNeslmY8Pld/7L3+KTH/2cxVyw+c+Oujx7dIqdExqa44OeYLVoJswGQiAY+NA/g3AJigaTLuF0yuXFkMV0QRzF3P3qfVRVIe8Y6LootLde3MawxH4izZjtP4WcC3HI7s1VkjghTVKCQR/H0eheTimXLeI4pXOlYZjPIyxdJUpSGq5FexqyUTKu4D4JSZbiBTGmrjBahFQqNpYpEhknywjX/sWfxZ97ox/97//8W1azSG+vj6LIKLKEndPxJz7d8xnluknmBci/9DXQDQGSGXSEZ/vpE7KeKFLGzQ2yxRLZNpFu3RHt5CQRreIkERGq50egGaAbZI8EpS4+71F87RoHf/o++bLN6emM7c08RjWHnHeQmi1xux4N0a5vYEzH2F++QzqYcPj+KcWShZRlpHOfg8sFoyQmn0Jpvcj2r70AiwXSygrZ6SnxaIn1+m3mHzzHa08ob5ZovNBAX69DFCMdPEN+6S68/y4YOvhLuDgXxVLTePbP/z3Vb/4y2adPqNQsJE1BMXWyuU/3csl45JO3VLRKjtSLmM9CjiYesyThRsGi7UekGViSzHwR4xZ1FCnDuraCrMpIqyvIrZYQ3CUJ0vYu6Cr6l14Gb4m3d8FgHDBNElq2QZJkNCsm7s2m4PYDwdEl2loDuVQkvewJS16Wga6RvvceUv8S6c59QdA72YdKExZj0g8+QDZ1kuEM/ZffQBoN0NbqSLvXoHv5hb7R/9u9P/tW1XbpLcd4cYytaRQMh4yM/VGX9UKVWbjgVvkGYRqRpDHPx4e05z2eDvrMwpA0Sbm5tUp3MMHNO7zYWGceLikYOSqWS5plvNd+Smcu2tO6ovHdwyeUr26+13Za/Ohnn1DI2QzGM1brZRIJZFWhYltMA58ky7i51qQ9GPNfvPkGP/l0j8PjDpubTcaTOcPRjN75gMyL0UyN1lqd1mqNzmzGvdYKz4ZDkiTl7716lx/8/BGnFz1u3djka/dvkUgQJQl7o3Ne31znL/Y/wlQlZuGcvVGHKElwdJ3/+d/+J/7pl7/Co94Z+UqBLE2xHZMsyfD8kMlkQRREFAoOYRBBBosoIBn7bF9vMe5MQBLumNl0SaVUQFEUDEPHMHSqOYeGW2Dki/93u1RCVRTe2N5k4vt0emMWCx/d0Eh0ifFEFHvT1CkUHFJFotMeUHJzbJZLnI0nvNhsUrRMkjTiB8d77I8ueaW5w5rT4kH3MZZmkNMcDmcn4gCWxtyrvMAsmtPKrVDQC3iJxy33zhd2DwP8sz978q2Veo7DowHeMsS0TRxHJ00zzo77FEoOYZxys+mQSgI8czwKeHo2+exw4PsRK60SQRCzulpgverQm/qU8wb1vPCqf3QyEiDNKKVgazx81seyNMIoI1fM0bvoE4URkR9gWCbIKg
svFp9RkoSiKmxvlbm8nPHmG1u885M9ZqMZbtVlMV3QbQ+vospljHyOza0yfigS41ZWC7TbM4qlHHfurLK/16HfGbJ6bZ2XX9tmOPJQNJk4y9jYKPPWkz45W7gmLoZi9BMnKX/13U/40qtbTBYhnmSRpiKxkgxybo5BZ0CaZuimTiyS1sGwSeOEtesbTPtjsAoQLAS4LYnQCyVUXSVLM+qNPKWSxXAkAnHcoslsFrC+7hKGCb3zPkhgFIsk0xG94wtat3YIvAg9l0PTdaajOUkmUa/nCMOEZj2HqgkNRLu34HKwpOxa3GrYHA48Rl5Cq6jz8HxGfywOcq/tVlgECev1HDcbDscDj9976f/7Rv+5hf7if/lfv+WuFUlGC5yqQ+nOClmcojs6w4sZ1WtVZF2DOES6cVeotZMYTg9IDo5FprkuLAWSZSLdui0Ibd6SrH2GlMRQaZAdPUMqVYUIr32KlC/AbIpaynH65x9SrVkMu0tWmzaFuy2UchHp7ovCPzoeApAcnqBvrRCfdPCP+1Rv1JleTHh6PuNZf8GKobFaMFl6Ca2v7JLOlwKeUyoRH12I39Efo7sOmRfSPhrTPxpy/uCMla/dRXrpFSHIq1ZFYl1jBSZjqFSY/+kPWPl7r4LnER6cI8ky0/0eeslGLVgU6zk6h2Nq60XSZYC+WsJQoN2eYckKWZoxSxKiq+cuZ7C2VUJ17c8ARdLNF8BbiMQ+2yYbDkQwTRSBrpNrOFSLKv4Vg1lMFjLKOxXksks2niDJMtl8TjadfebT9x7uo2kZUlnE30pkgn2wdcXel2SkwQUsFmR+hKwrJJcjlK9+Ff/PvoNatL/QPvp/d/AX32rlK4x80YJfLzQwVI2cZnMy6XG91PoM5bqZ38BUTPr+gEf9M04uByiKTCXvsIgidEPjTr1OmiXEaczZbIgsZbhGnvN5n7V8hbpT5nH/mK2iS2+5oFYq8NP3H5OzLeZLn821OlubTdZLLr+ycYswjTifTYWafzDild0N3j44pN+fsLO9yuXliMvzAbPhnGLDpVjOs/AD7myvMZzOcWwTTVFo90fYjsVRf0jesVgufC57Yx7vn3LRGfA79+/wavMajmaxU6qJFr5d4mw64EZlhf/jr37K7339dXrLCU8ve8iyKKqVqiuKfZZx+OiUnVvrTCcLCsUcZBnj6QLHtel2hhAkkNdJoxjilHK1iGUZZFlG0c1xq1JFliVmYUjBMOgvlzRzObqLBbam4VYKlOsuWZJSKuTIORbj6YJa1cV18wK9LEssk5iL4YTdepUkTfmPP31AbKnUHefK/qjwfHzE/foL5HQHW7WZhGPm4ZJl5KPIEmESsp5f48Pex+T13BfeXvc//clH32qsFBlPQxazJTdvNZFlmRvrLk/3LtnaqiDLEl6U8cpajpWCztk4oD/xGAxEQRLWsQTT1DAMhTjNUBWJ6TJCviKteWGCY2rYhsrB5QzH1pjOQgxDpXsxILm6ZTfWm9SbRVZX89y4gsGcXswwDIXL7pzGisvB8YgszSg3yox7Y6LZBOJIYMTTjFSSqNYLzOchtVqOKEo/49WHUUIUpyzHMz37zRUAACAASURBVHwv5PSoz3Q05etvXsM2VCp5A1VVmPsxOUtE5BYdnQcfnvHVX7nB5XDJ+cUMWZaZj+dYOQu37DDsTQiefYhUrBMGIflSXuhLsoycm6e/fwjeFKXcIMsQIm23hpWzsByLai1P2bWuMuFTNE1hvgiplm16/SWmqeCUimj5Ik7ewWk0sCpVhr0xds7GtE2SJCPNMgpFodovly0cS+MnP3yCYVuoqkylIvgZj9tzSjmdak4nTDPmQUqcZmTAPEhQZInfeaHKv/zhETlb4/dfXv3bt+5NUyUeL1EUCXOziqSrJF6IbOlouiLAMrmcCGGZjcVJbdgjm06RDQ2l4iKpCuFZD6pV4Zdf2wK3hHTtNjRa4F8pvEP/b8R4WUbWuSR4ekKpZJBdJff4foJ/2OfyLz
8he+9nYlRQbxI/PWT48AyyjDROMHfqzI8H9Po+DVvn9XUXgDhKqVVN3v83HyJf20H6+m+ALBNPlpCk6Dc2CbtTup0FAK3VHC/9998QosAkIfv4AdnTJ2BZZI8/EbS+OCb3pZtIL9wTgBtVYbp3Sa/n0fm4TXQ55ejDC3Z3iszaU9RyjiyMiSYeWyWbIE2ZxAmqJDGOE6I0o1wwsHbr6FtNtFYV6cWXBbq2sSq4/k4OaWtXvEjFElLBRXrjK+g3NnFsFS8UKvnWZhElb8PqKmF7jFK0kBQZpeDgv/0RRBH27/02lEpIO9eQVltCY2FaQjuRJqJzEcckc2EDQdfR3nyV7NkTZqdXxMIv8FJlWYiAZIVWrk7BcEizjCgV8JckS3A0m67XJ0gCBv6Ai3mPi9kMw9QxdY3+dM75eY/b1So3yxu8VL/FSq7Gays3aOXrV9G3OcIkom5VuVFeAyBMEp4cnCHLMjnHAmAwmPLJ40N+/OApPzl7hhcHNHM59s4vefzkUIwFFj4rKxW63RHdwZhcyaHYdAnCCD+MMHWdP/3uO7yxvcHvv/AlCoZBkqSEYcQr6y1mU7F/kyTh9s1N/uCbX6czn6MpKj85e8aj3gnL2Odh9wjXNBn5U7784nXebL1Ab7kkTVL2n59z2R/zfO+Uxdzj7KLP7t1NDo7a5As2i4VHGMVYps7SD0QEsmvAJETSFYq1Its7q2y36jTrJd5YbTEOfIqGzVqhgK1pXCuL9norX6Bomry+ukm9mEdTFaYLj9nCw807xHHCnVqN07MutiNUy+vVEh8+P+b5cMh/99u/xm6pxL36Gq18nqrlsuOu0V5cIksK03CKKqt4cXBFwjO5VtxlHIyZhwuyv7a7fIGXlbMYTX0URabWLKFrCvN5iCRJ6KaOHyY4psrZYMH5JORsEtKdeLTbM3xPjAo7bdGC3moVeGG9xPWVPLWixYsbRVxLoz8LUGWJmRexVTapFU0sQyVNM4aDOYEXYOUskjhhPplz9LzNJw/PGcx8oiSlXLY4PR5y9MkBhqF89ncPOgOCyQTZFofDLMvgisL54J1nbG1XuH+tSj5vkGWi7riuSeiHGI5NmqRsXFvhn/zj13i418cLYx4eDBjOA2RJojPy0DUR4bp9rUar7DCfh0Jb8PyU5XxJ96zLbOZDBuXXvkown2NaJt7cI45isiRhMV0g512obpBMBiCrkHNprDcoVwu4rkWrmccLYjLAtjUsXWWtniNOM1YaOSxLY71VoFJ1SNOUxXRB4AWkSUroh6ytFZiOZhimQb87pV53aLdn7B0MeflL12i1CtzeKaPIEi+08txdKxDFKboiMVqKuNvZQghRW2WbrZrDD56PGAyEtfYXrc+90Xt/9IffUiWJOIyxNypI+TxxeyCKvwSaoaLmTXHL/NIvQ/ec7LngoxPFpAuPZOajr9WQGg2RoDafinZxEosOQOCLnxkMRFCKosBsSta9JDgZEngxJ6dzsiyjVDKxbzTI31wVB4zFHPaekSUJua+9yuKtj1EsnU9+sM/50P+saI4nASDR8yOiRUyr6SC3u+ibdfA8ssGIsD1Gd8U8RPYCyqsFnG/+Mp1//QNymxW6/+YHhKc9rOstMZe2bcHG//c/xqzlREhNGCK3muh6RrlVxNQknj3qc/3NDebduSBZqRLx2EPRVWxDBi8VyU7Aiq5RNTU2dlz0G1fuhZUVEbVrmII2GEVCy6BqsL4LeVdkA9g5JF3DNQNyvoeqKuiKhL69Ar0eaqNM1B6S+hHJzEM2NRTHgMBHWlkVB63GqsgWePIxUrkKVk6E3ciSAPGQIdcqpI8/RYpCWAZozRLyb/yTL+xt6NvH//lbmqIgIdHK17FUk0XkMfDG2JqCqeo4mo2fBKw4TZ4M99gfdxj5PpqmMhxOCYOIW5urNHN5tt01lpFHdDXnX3EaeLHHW6d7+ElElProikp3OeF4OGI+XZKmKScXXSSg7Oa5/8IOdkHYxTrzOf3lkq
UX8Ltffpnvf/wUt5TnZz9/wmQ8J16G1OolFksfVVWYDecEScJmq05nNsPNa8zDkMFiyWgwJdAkZFmmP5xy7+4O37x9jz/6zl+xs9bgj77zPUazBeVSgYZTQJGhPZ/z/Q8eU6+6QMg0CHhxtUlgyFRKBXI5iwefPOfVl29wdiE0M24xh+8F6LqGLEmYusbCD4Tw1lbJOzYVN0+96hImCa1CgbHv4WgaJTNPkiVUrCJ5zWa7uEor3yBJY5EymPhUykUuh5PP/udK1eVsOGa9WWUyW6KqCt3hlHrNBUmiM59ys1Knu5hwv3GDzcIm3z9+lzRL2C6sY6qmwOjGC5IspWjk6Ho9kVIYzikaeV4o3fvC7mGAP/z+wbckSSKOU3RdISMjy0Q6nWHqyIpIMstbGmslk/3ukvEyJE2hWLS4bI/JFSwaK0U2G3luNEQozMyLCRNYKxmcDX2O2zNqrsVgEWFoCt2Jz2TiMx3PkRWZxfkJkumgKAq37q6RZjJ+mDCZBYRhwnSy5M4r23z6yTmqptLbOyAJAuifkOUroqMrSeAvSIIQd7XBbBayiFOCICGKEmaTJXGCSOLzQ1Y3G7RaBb7z5x/Q2qrx1o+fMhl7uGWHvC2CXdqdOaenY3RdQ9UVhmOf3a0SsmlhOjaSLDE4Pmdle5XueR/dtsi5OcadHo5bIElSUt9HUlUUVUE2bXTHQlEUCiWHLMtQVZkoFkQ9RZIwNAVLV0jSjNWKQ61gYqgKSZaRSVCtOgz6C+Zj8Rlq522GgwW5ghDxmZbOeOxTKolZvefFFAoGfphwq1Wkkdf50dM+yyChVhC1qWxrLKKUKElZcS0+OBySZhAlKcWc8Qtv9J8bUyvLEmkQYdVyJDMPpVgkSzNyv/Ea0g8+IPVCSBLCzhjj/JDs/ASpVodmCykKkaZjlF5PqL1bm2IGn8Rili/J0D4R+FVVRXr59St/dyRu6pqGpMgMRz7r6zl0W2cyWGL7MXLDBt9n+e6nIho3Soh+8iGypfHhjw6pFAwKEkRhSpZlPJwuKaoKdU2lE0XY04DiSp5sOoEwRF+vo9/YJOv1GT27pHSryXvfe86bW8/p9jxyPz9AUWTsVfcqGGaGtLVDNnqGs1PHe97BqlTEQ0sSZE3BOx2i1wvcuJcRDxcoioRTydE9nlAo6pgVBymKqVVNcp7KaBywvpajsF1Fv70tDhJRJOiB9ab43XEEu7eF4DHwIOeSzUZId75EtvcAKnWkb/x9iuq3KfgBkmMLWl6ng//4GNV1UJtlkt4IpdUgPrskORtgpCnk80iTMZwewmQC58fQ2hGJgnZOQJK8UHwPkO7cxW42xWv7BV5BkqBIMiWzQJwmuHqRKLnkdmWXvZGgyfmJz/msi6M94+PeIWv5EjfLLQb+hGNnxNl4wuVszjd276NKCuNgxopTQ1d0HnQfsz/qUTAMXmnuIEsSXhwwDQK8ZSCsbtMF1zZXRUxqGDPxfYqmyXCx5LI9QNVU3FKebz98jKLIvP2jh9RXK/hhRBjFKIrC8mICroHmGER+SJqm6LrG+GrevVErc3O1weFgyKA/4f5L1/nBd99HVRX6oykfPj+hWnVpNMs0HIfz2ZhblVWGnke9UeZiMKaVzxMmCeezGWmaMZnMKRZz7Kw3aZ/3cfMOmq7y6bMTSsUc+byNqoqbm2MZxElKveJSLDpcW2+S03Vyuk5nPmcll6No2MRpzJ3qNdZza/T9AVWzwjAY8XL9Lg96j2jlK9yrFzmfTKnMili2QcmxWUQRDx4dUKu5rFVczuOEa+UyB6MR+4cXFAwB1jmeXDAJ5lzM5ziazjyaU9ALlMwS9vISVVaIU9F+3ils46jikPdFX6oqnEOlkkkYijz4KAp59Wad0/6CwdgjSlLGy5DDgbjZVwsmDddicWUn87wI348pWCpzP+Fk6LFVsRktY376fMhFd0GlZGHp4jWNk0yE3yxCNF1jOVtS3N4hDmPiOGY2CzFNlcUipN8RI9TVjR
qnJ2OiMKL3yUeojXXIIE4TcsUc8wdvwVXHC9NBkiRKZYdcTvjbsyyjUrHo9ZYkUUK5UeZkT3APSBIO93uU62U2t0qkaUZ/7LHZFO3/5TLC8yJGs0CEyswClsuIcX9MbaUsFPNdwZsAuNw7xHCFYFuSJIxCnmDpoVomuqlj2iatNRdVlT+z5GmaQt7SANiqOay5BiejgFZRZ+zFbJUNPjydAdB0LS4uZkiShHXVicpZJkfPzijXy1y7UafTmVGvOvSHS85PhxiGSqFg8MnJiP7MZjoNqJQssgxaro6lyTy6kJEliVkQU8kbfO1Gmb2qw2l/8Qv3z+er7pcxYZhgvfmiYKqnKcHUh8kErewIkZfjoL+wA0Eg/OyyAqWKENpFkWDZ338FLFukt6WpyGPfewRhIEAtqkr28YdwcUJ2fEAWBGRBiFK0cIsGVj3PYuzRuL9GcDESxUaWsXZq6Ot1JkcDzC/dY3I+ZbuVY3XbJY5TPhoveLczRZVg3dDphBHXSjaOo2HdbCG9+hWktQ1xQ5dlZh+dUv+1uywPe5RtjbA95tZXNlByBmbeYH46EoV3c4tsNCDaPyNZBli31yFNWXx4wPS7PycazLFfv8n+20d8970zpr0FZt6EJKNcNii9to3eKGLt1infbFCt29y8U8Mu22i1vMDnuiVRfDd3oLn2Nyl0hk3mzcEuiCKv6kiWI4ryYgaVBlSrSFubousBYBiYWzUURye66BONFmTDEWo5j/H6XUHqcxzwFvg/eId0MBKCvMtTEVXre+L1cHNQraL8zjfJDvZFkf+CF3o/jkmylJJZIMsy4ixh5E+RJZlWro4sSSiSQitfZ+RPKZkWUZpQd8roskqSprSKBX77+gtossoiXqJIMlEa8XHvKXGasFYoEiYJ75zvcTLt8KR/TpSmWLYhTu6mjqzIzBc+K6sVLs56LKKI69UK5UqRe7vrPH12wm+9eBtvGXDj9ialUh7PD/HbU04+PBIiN1Ummvs0GmU0TaVaKvDVtbvslupEqRhvnZ50+cbr9zjYP0e1daaTBb/05j2qNRfHMem0B8zCkI1CmZE/5Wmni+cFXGtU8aKIx3sn/PzJAZIk8eYL13j/wVOev3cgsLqaQhTGrDbKbGw2cRyTer2E6+ZYqZfZ3mhSKNjk8jbLKGKtIISKrzQ3uFff4UZ5k61ii6JeYBJOKOpFRsEYCYmSWWKnuM488shpNo6hU68U0XUNQxGF59bNDXJ5m7PhmMl4ztF4zGzp86v3bzMLQ1p5F03R+PfPHnB40SXJUnJajiejT/HjgGXso0gKZbPES5WXOF+c4ycBcRp9zg76YqzFXKjQN5sFJAmCIMb3Y5aBYJ87jo7rGFTzJoNZgOvoxElKLW8gIQ4K1arNq7fr6IrM2I+p5AxGy5iTvtD1VEoW46nPpycjRouQ9kg4gFRNwVt4pGEAGSiqgmEadM4G+H7M1qZLqeZy6+4aB5+ecvtWHTtvU9i9iZN3iIMAIp/5wTMoNiBfBlVHuxr7KYpErWBi6YIDH8cZk+GUnet1/KUPiQhz2b63S77ooBsqhwcDNE3hxprLzIvodERxzeV0giDm458fcHjQR5Yl1rbqnLz1Y0aPHuBWiqzvrpAmKValglMQ3Qmn6GBYBoZjkSvm0HQN09Lx/ZhSwURVZXbWXa6tFqgXTZqudZUDn7Ba0BktYwxFpmAq7NRsgighzTIMQ8VyTCRJolAw0XWVG3e3KFfztNszZpMl7cs5g8GS3et1giCmXDDYrOV4djzi5KjPfBkRJBnfedQjSTPiRAgl7604/O79Bj/ZH7MME+L0F4+gPrfQN+420U0VBgNkQyjl4yglWy4xvvF1lLxJNhrDcCiKe2tTFKN+RwjWin8d+KKIGXPnTKS5jQZiFqwooOnQ7Qrb2OY1cWOWZcLulN5ej1wzz+x8gmmqBO0xkiTRf2dfHCKA0VtPqP+DLzH6Dz+ltFnC8xIWgyXtWSA2uCTR0nUGUcztss
N4FrH+P/xXjN7ZY/atfyba1ZUKix8+xNqoMH9vj4/3xziOxrOfX7D/7inR1Kd3MaPyzTfE2EHRYLkU83NVwfv0jEf/6seCM1AwefrOKSf/7j1KrsFLjTzFZo63H3ZIoxhZU5l/fEqyCEjmAbKlYW7VMLerWDt15Nu3hcguy5A0TTAFlvMrpaoi2vQgWuqSBHaObNyDYCmeXZqKv3E2E12BMCTujUkWPtF4STxZEvRmnP7oOcFxFy4vic8uydptcMsYX3sDubUibI+WDd02NFZJvVAc2Dod0r/8jkivK5WEg+ILvHbcKjISpirebIok1MFhEnLdvYYqq4RJhCLJ3C5fZy3fpOGUeTo4wlB1arYQxQCcTi85GJ/xsHtEey5av2uFBgNvSXs641alwXaxhQQYisLJ8SW94ZR61WU6XVDI2wz7E/wg4tGjQxRZxvMC3vrgCf/Nb/4S/9dfvk1rvc55p89i4RONPUgBQ8Fs5kkvl9RXK/RHU/7gV3+dDz96zv/4J/83G4UmrXyehwenrKxW+N4Hjzg+7rC+UuXwuM3Dj55zeHDB+XmPX3/xFqaqUrPLTAKf2ysi1vnxaZv/89tvkaYZtmPyzruP+N57n7C5WqdxewVFUXj63nNMS7DPT447KIpCmqZCUV8vicLfKPNia4Vb1Sq9hfjwnYZzht6EMI0I0wgv8VEklTANkSWJol5g5I/QFR1LFRkEtqoSpSlFwyDJMkbDKbO5R5qm+F7AaDTj0afHDAcTng0GREnCWyeHGIrOqysrXGs1aOXrqLJGQc9jqxbLyOeGe41FtOTd7rsik8AskWTp39Hu/P+/NrdKaJqCrspomvjYzjIRtfor10tU8gbTZUgQJWzVHOpFi0rB5OMT8ZkpwlQyelOfvc6M550ZH+4P8MKYOMlYqzhM5wHTacDWSoHmFbFNkqB30Sf0Q5ySy2w8Yz4RB4PACxj1J9iGymK6ZP/ZJd/47Rf52dvPKZYEp345X8JiDJHADkuVVRhforplosmYN17b4Hj/kj//jx9hagqVskX7YkypWuTZkwvGR0fYbpHueZ/DRwcMu2NmkwW3bzeIooS5H+OFMY6j02/3uTgb8fCHH+JWXaIg4nzvmMuLEdLKLmprl3F/wuFPf0ZttUoURgw7fbIsI0szJFmiXC+jqAqarlGvO5RKFvMrC3OWZUyWIggrzUTUrSJLLK+eU6uoM1oKJK3r6CRphmkq5PMGuZyg/F22R4xHS8IwxvdCRp0+5yc9xv0xo5EnOtCPLhnOAzZWCly/2cQyVTRZYrueY+yL//lr10u8fzLjX719xsVgQSMvHBi/aH1uodcbLqprE7VFW4YoEmD+QgE0nag/E37qeh0KZVG4k0R44r0FWa+L1FiF5qrg2YeiSGWjPuSKZPvPyM5OBRZ2c1ccELIMRiOigUAohsMFlqViNApo5RzGWonCVoWwPWTxaZvAT3j0L36IIktEowWVus0HR+KWYMkyFVV4SV/aKjKYh9x7fZX+H/4xJyczcn//l8i+8+fg+6RRzOjZJdOxT7NoMp2F7NwooxsKqmOgG4qIgt3YJOtfQhQxf2+P0fMe5naN7etldNfmu++estK08fyYXt9nNA7on05542YZreSglkSiXtgRLXDZ0FDLeeRqWbTaDZNsNCA7PRHPfGVDiOPGfVHoF2Ohvtd08SUroJtIjU2BHV7OkF58RTzr8RhmM9RSjsO3DsVMPknRXZv6tQp6vUBw0kO9f0ccyIJAeOMtS4xYrBxUG3Cyj/LSHbKPHrB4/zmSrkOtJqx+6udOf/7O193qdRzdorvsi3AWQJFkTMVAlRVmoWh3qbKKLuvYmsk0WJCS0VuO6cznvNLYou6UAQF5cU2Tk2mPguHwH/Y+xIsivr69S9kq0ln00RSFoeeRJAmNqst87mHoGkU3Ry5vU6u5VCoF3t07YjSa4YcR/9v/821Kbp7hYIJpGpydXor3gqlAQSdJUtburTNf+tx7YZt/8Z
MfMlss+YPf/DX++PHbAKRpSr8rvPTbWytcdIfUykUkCWzbwLYMzmczblXWaM97xGnK24+e4y191hsVrm2vYloGD7/3Mbtbq0znS04uekzmS5YLjxuvCgHoXx98Tk4u8a+81bqufeb5bzgFTqdTnvb6mKrKdrGFo1m0531MxcCLPebRHFVSKOgFAbKRNVzDZaPQZBzM+K2dlxgOpwRJQm82Z3u1ztFRmyQRIwvXFaOD9bU63csRv751k7VCgSAJidOUeRiSZhlplmCpFrNoxldWXmfgD3jYe4wu6zSsupi9yl/sPQzQLNvk8zp+lFwhcAVS2NKFj3owC5h6Eaoi08zrqIrE3Iso2jpzPyLLMjabBdYqDpoqrNKVosn5cEk5b/DWwzaWpfHanQaWrjKcCwFfECREYYRbc4njGM3QqK/VqTaKGLZwVHz8pMt8Mmc5X/LtP/4+hVKB5SIgDkOi0UB0cq0CmDmyOMZc2yZLM9RCkb/87iP8pc/v/sOXefC4i+fHxFHMZCgOibUb11n2+2iGhmII94dhGhwdj7m7VSZOUqbTgPOjLpqusbpWwt1YJ8syZk8/orq+wnK2JJsOSZMUwzIo377LbCwOK6quE3gBqqZiWAZJnOCWbCxbxzQFmOj8fEohb2CbKuW8wdQLcQyVeZCwCBPSDBxdQZKgYqs08xq6quCFCTfWS/S6U9I0o3s55frNJt1nz/GWIXEUo9tCpNtoVRmPFrx0u06l4qDIMssgpt9fkDM1CqZCPafRnUX8+gtV/vLTIR886TJfhtzbKqMrEqryi8V4n1vo/YNLgt6MeOah/4PfEVYDQyAMs72neOcj0iBC2tgCOy/m7t022ckh2XIh1OlH+2RPPhKQl+X8s0z27OQAqdGEcplsMhaCvCRBcnJ4HzxFr+XJ79YJw1TYzPwIreGirDWRdJVP3zpmOvaZLyJ27tVxrjeYT0P+6uNLrleFT1oGgjTl5nqBZyczbEUmnnp8f6/Pi//oLtg28WAGhoHiGFiWynwesfXGBqurDouxx/rLLfaf9HHrOSbvPCM72Mf73js8+Zc/RG8UcbcrjN4/wpv6DM4m5GSF5TKmtZankNdJAdfV6fY89j5oE4+WLPoL9HoBWVcZvH909bB9cF2RdpfLI23vkJ2fw0KEzlAsC5qg44JmkM1GAvSQxEiFCkyH4rYvK6Lgex7Bp8egqnjP2+x8/RrzJxdIqiKCe3SVeOKh2IaIyS2XYT75GxW9bsB0SHZyCIZF9O5DUBQkXSHzxOmcOBaHgi/wOpt3uFwMkZB5sXIPWZIxVB1LtWkvOkyDBZqisuo0kSWFOI3pLseM/SWzIMBUVX56/px3L57ScMq05xMuFwueDYfsjzrslkpIksTBqM/j/glhEqHJMo/3Tmi1ajg5C8cxqVSLdK4QuDfXmui6xkef7LPwfJIk5atvvkih6HB20WNwOmB9owm2JmLqgoSN1RpnR52rmb3Mo/f2+K1ffoXucox31d1K04w4SYiShPXNJjd215jNPe7d3WXv8BzbsXj45JCPusf8p48/4c9/8iHr63VK5QLvPdxjPJ6TJAk4GouFT6Pqstqo4C98bMeiN5zw9PnZZ8+2eNX23Ns/R5YlFkFI2bY5HA+oWBZ3mw32hkOWkY+tmWwWVrBUC0MxUK/GIH8dFdu0V/Bjn4E3QVeEN9ot5Xn89BhVVXi0d8K9uztctgeYloGui07SeCLCRN5vH3GzsiJ0GIZD1bbpLYcMgxGTYIKpmPys83MAkixFkRV0RWceLXDUL7ZzBOCkM6PfX6LIEv/tr2yIxqIiYWoK7x9POTubiNZ+1aaeU0nSjPEiZO5HnyWp7Z2M+fRsTDVv4ocJ3eGS+SJk5kVUKiII59npmNE8IIgScrZO+3xEbVWQDN2qi+VYzMYz4jjl1p0WTt7h4qNPSOIEMrj79TfIsoz+aRvmI9RiWQiVswxUHTPviGCb+ZRipUjcv2DrRou9szG2rSHLEnbOxl
t4LPojkjihtrPJ/LJLpVnBG00IvID5dMmj4xFv/+yIg2cdaqsVynWXRx8eMO6NSdMUDIcoEO+N4uYW6XSIbuoM9/eZXSnfsyzDdEwCL2B4OcSydeZzYSecTgMsS6VctvD8mMkiZDwPaJUsHF3G0mWiJMPWZHK6jK7IrOZM0iutwVrJQlUk3JLD4bNzDFPnkwfHrN27ReiHVOsFVE0l8ISWJw5jDs+n7K4XCaKEvKXhuiajecAiTBguY1pFnW9/3KWcM9B1hULOQJElDvoem7XcL9w/n1volYKFLEvIqiLU2Isl0TIQ7drJBKOWF0hX0/ob8I1yRUzr9URLfjBAqjVFAcpfceF3dmE6Jeu0xa3QtpHKFbJOm2w4wHr1Figy0WDGcOSTehHmZhUMg/i0Q+pHuEUdx1bZeGkF0ozp0w7nnQV5RWE2C1ElibyiUNZUDs5mZGQs4pQ/e9imrmlEgznS1nWUgolUrqCVc8zmEWmSoa5WWS5jciUbkpSN9RydkwmqpbN4fw9vwDiFLgAAIABJREFUsODmb95A0hRSL8QsWmRZRqfn8ebrLRRV5uJ8Tqfv0Y0iHu6NaK7m2blRxmgJ8UfvaZewN2UyDQkOL0UbPbwSu6UJ2dEh0iuvixGFYUGhgpQvIbk1JCsHsoJk2kKgl8RQrIobvqoKumCjgaQqLN7bw1hxSWYCspB6EYqtE/ZmSLqKbKrgurC2RXZ5KQJ38vmrDaAg3bxD1j5H22mRLZfYL6wj2ZboFkRXKOAv8ErSFE1WSLIES7UJk4goEW/+OIvRZAVFUrBUiyAJ6CwG1G2Xp4MBz/pinj1YLLlVWSNOE7bcGtdKVWH3mk4ZeB5hknC7uopr2hxPxiyiiPu3tlksfLrdEWcXPWbTJSsrFRr5HB/tnzCfLalUiuQdi+3NJuPRjKd7p4RhDDmN3nCCbmpgqsiOxv5JG0wVRZH5+X9+iNPIc9Ybsuuu4vshNbtApVrE84SwbMMtcnTSoVZzRQzr9Q2OTjqoqsJPP9ljNl3y5iu3iJJE/HzNRZZlnu6f8eKXb6HIEr3hhIvLASxj9g/P2dposrPRpFB0iKKY7mBM4Id4fkCvO+L0uIMXRbTnc5I05Ww65de3bjPwJ1iqSU7L4Rou67m1z4rrqtPCj32iNKRoFKlaLo4mDo+bxSKmoXN02KZScwmiWOTaT+bstOqcnndRNZXmSoWKbdPMVTgc90izjLWCSytfR5FkWrkWneUled1hGs7YKDRRJRU/9gmSAEVW/m42599iFfIGmqYwmoe0p6EISUkyMkT7vlg0aVUdajmNRZgyWYaoisRFe0anMyfLYDhcslHLMfFCVEWm4lqYpspg7OF5EUEQc3uzJFTsvQXjqU9ztSSKYLvP5UcfkSYppm3SaOT49NE5/tJHdmu4NZdyo0z7Ykx/77kYNZp54jiG+ubV55WFP54KGp1uMvjoA7AKjEcLrrWKTMaida3rigDyFPJsbJXpPT9AL7poukpxpcZk/xnL2ZKnjy+IwoiV9SpJkjEdzXGrLqquMjnYJ7+1y+T8QhAl955AEjPoDGi8cAun6KCbOlmWsZgsCPyAeHDJsDeh3+6TJCnLZYgkwWwWYJkqyyBmrWKjKRI5XeFa1aSW04iSjEZOx4sTJmGEpkifZaxEcfqZJ34ynFIoFUSgjR8w6M+oNErE8znLhU+5VqDkmlTzIqBsugwp5Ax2mnmGi5hlmHI6Dtht5pksQ25ulVBkicE8IIwTtM+x131uoddeuo19rYGx3YCjPaR8DkVThK965qFV8yTDKWzdhNmA7OMH4gfr9c9ibLEsKNfEbTNLyc7ELZNqVdweTVMU/bNTwZIvlcXvnyyRDY2te030VglpdYWkP2L06ALVtSnXbHK3VvAvxujNIufnC7pRRJSlGLrCKE5Yr1gkWcY4TjBkmUEck2ZQUGT02zvQOUNqNskuO8yP+uQcjVv/8EVGP37MbBrSPp3w8MfH+H7C2v1VnJe2+O
DjLqZrMf7ojCxJkTQFc6tKYbfGa7//MvPunHxOEw6SNEWXZBZJKhwE3TmHPzvGujqFLQZLLEvl8c/OiEdL0valeB5OHunefbLzE+FQUDThO5Vksu4p2WIsmAH9c9FFkRUBLVrdFrf+KES6/zoAatHi7KeHxDOP/YMpDz/p4u13UXIG6lYL+fZNCAKy/aci4tbzxIFBN4R9EcRreBU3HA9noCikE8EW+KIX+hulbbbdNW6Wdul5PUzVJMlSojRi6I+p2WWG/piSUWLgD3jn/Ihl5HO/0aBoiTfchlskr9uM/Cljf8blYkpetyibJrYqQC0fdc84nYxIs+zKm+7jewGVSpHru2vkCzbr1RL7F10uOgNK5QK2adBsVhgMptiOybQ/ZdmbgRcTxwnh5Zy1Vg1FFm1B2zQIhkvQZLIMbq42OJtd8tLaKqfTIUeHbUqlPF965RZ/9sP3CcKI0/Mu7//8UwI/5N7dHV6/vcOzvVMs2+Dp0QWz6RJNU9hp1VldrfKPf+tNhsMpuq4RRTFJGIsbfk/shaUXcHJyiWGIWf1svsQ0dB59ckAQRvSGE2q2jaPrvNJc48POIdvFVRxNKKxlJE5mpyxjjzRLOZoeYakWqqyRZRlNu0GUxARxxG/vvIHtmFQqRR4/OaLXHTFf+jx89ykfPz1mvVWnlHe4UxM3zu5ieDVWGRKnMbZqEaUxcZpgqSaWaqLKKmN/hq5oxFn8d7Yv/7br9Z0yrZU8v3GnxqedBStXEatpBnM/EkK6RUirYPDgfM7ToxGKLLG2KsR7jqOxtSX89ws/Jk5SvDBmvZojCBIMQyWX03l0OGQ8CygVTXRdIY5T5pM5hYrL6sv3sfM25UqOk5MxSZyQKzpYjiWCXiZXuiFvKr7CpcgzGVxQXF8nSxLwZ6hODkZtyFLUXI7tnaqYSW+WKDg650cdDNti91aLB99/H0yHKBTz9v+XuvfssSTN7vx+4c2NG9ffvOnLV3dXd0+bMSR7tTsSRS1XlBaiQAErQBAgfRJ+EWEBLSDplVYSQIhL0Q85tqd72lR12fTmehPeh1482cVXM9CstJrep1BAVSKRN/NGZJxz/udvNtczDr/7IdsHA/yXX6FqKtcXc1RVxnIs3HaDzqDDO//R90iihMbWkCpJxPDjTckvX4lCG6VkiVATFOs5RV6A28N/9jl1VeNtYhRFRtdVhkOH2Txi1LGJM8HnkCS49nLirEKWJF4tE+oabFVBVyXu9EyCtCQvK+7vtkQDoqlML6Zcnc1J5jOWP/sBVydj3NGQbq/JvdtdVuuEZZBSVjVxWmAZKpswIytK1lGGn5SvOQJXi5CmpbHwUvKiIs5/OdfkV4fa/Ot/+cdVmKD80b+g/vgn4LpkLy/R+0Jupx7uivjVdgsml8KxzfchiqjHE6TRCKnXg9VcFP44gtUSTk4oj86QtwbUL1+JRDjTRNrbp56MIQgolr4w3bENFMdg+dePkauK1rv7yMMBqg5KxyW7XHD1bI4f5diygq3IPA9iUegbBnEqzGjiqiKtKyxZxi8rdvwNxdMjpDRGrguMjo0+dJEaNkpVMD5Z47oGu9s2T088trcdfvAnj/nwYQ+100A1VIyhi9IQhj7J8YwqTLC2W3hjnywrWSUFm7Lkd+52+VefXOIWYGgydVXT3nHRDJUXr1a889EhdVmhH45EGp7dAG8t2PTtDvhrpNEh9asvxR02uRQt2vkRqIpoDHQDkhCSCKnVE3DZ0y9RmhbOwCGfeTwZ+1Q1+MsUo66xTFDu30MyLXjxnPTJCZK3Qb57F7p9odGvazg/EdexrqmXa+RBjzoIoSqRLAv5+//FN1aD/NPpj/+4qEq+t/U9vlx+ia1anHqXbDX6HK3PudsW3vTTeMoXs+d0LJNVErNKEqZ+wF5LmLscrSe0DBM/SzharfjqakxQ5LQtiy+vJ2y7TVzTZGA3eLZYECaCJW2aOiBhNyw++fwFzabN7ds77HfaFBJ0Ww7jyZLj0z
L06XswPZIi2dRUywVcPqXJMqpK5Iqu56K04vzkmpfuT/F9zeF+j4uzBWEUGtJbBANj72pDaADCAe6Nh2I0M5hKwcpWnQd9YbLoHUcm/CKVpiAQGS/R0EzkJtHOMudtYI1lt2craS68UIpsf2qS6vqdZG1yo9OpV4XczvWloGqvm+btLjwXRQLpslsvhP3t8yTnZwT9Pm4vJh72Tay6wt2/jTuayNqqL3t08pTkakadpiKjC3qyltgSB00zYvf1SneqAksU3IgNL0FMe/4YAB0Zz5mqhqbB8z2UVsSxT+Bp+qGLqxWBp/FD//e6dIB/3kR/vRGY1nGoVymOVqiejzMa4Y1j8pM5qhfQHj9H7e1IARgOGd4YcnQY4zjwA7fGLJYFntlhPzpes1gWVMsUdxyLZ37cpz09JfkHvwhNiwpciqJBaYdFWpI3kigHQjBb1Q0ODpWBzUOlGGi1NbfpGVhfWMBQNA2nZcXQJMD5Vj9vIPe0aYmUojQsdYHp1VZ7HymHTV1vNfLXlUzOoXK2pmN5KyQ3q9O396takebZHXlhNO07rmJZ1VxXFUMj7XCAZ+uMohR/5d9564rffWdOUTYsF8KuvrzMSC43FKcLvvH2Nft7IZuiIjBIRWleS9E0HPgeS4sgtNIILetmKwEE6PU8FrUJ+WkayrKWfXJe47ia3d2I5ekK5Wnwfdr33sOJ5QO6Xme40YdfXB+HwxZyrRRnySUDv4d2NH2vj6995vkSX3koRxFon29enPD9t25w43BHnKaU4ujhIXle0pv2qKuai9mCoigpywrX1ezsjoi8kMeLBX//n/7O1qGqtIiOYewHvVCKVl+86eumpqxqmqqRImthIat716ozyGlbSakLVBduY3fqPQ/yqpvytdMV7cpM9CAF0srnQH77IzPJF0Zq5zidLK8vufXb+5TGfa9BbhNpSCvqRcZwT8iL5A3PT69QSnF8dsn7T89498kJmzQjM0EdWVaQZgV5XvLmO0+ZDPusk8xwEyohKsbCA9DDQLgHkSvnOA7kPVHOVm4XRwGL+Ro1EEKjMkPJap2ilMONgx3m8zVaa24MBjxbXVG1NdpR5FWBrz/ezSrAZlNQ1o2k1V1uONzpMRoF7PVc9kcRedkwGQTbqNKiqNnd6zN48IqwyMM+4Y070DZSmJM5+WxGnqRkSUYYhRzdmnI0jXn+fMXXvvpYzHKqSop8kRmDmraTphlyWVUUoryyfu2jAzPxG+lcU4tznfWwb2p5DHubbC3FsalN4VbSHHiBWQn0X4C0tRTmyyedX74fy58ykyn6RQlbU8t9rZwvGnbe92G/C9WpK/m6P4ZkKedXpORZTnV5SvLsEZw/huWlvF4tMsC6qplfGo18kcq52dhc6DT948NOupdvpMjnSdcwZWsYGZtg7W0bn3q9hKZCu5rhzpjl9ZJ4EOO6iuOTFY8v1hRlzXyV0+8HH3r9fLRhTlmD7+MfjQl+4JM0RUVb1DCe4ty7S5tVOK5m8+3nRvKypnp6RnhrymJZEHrirJTXDU/ygr1BQGI6/O984wIV+fI4V+ewXBJ9/iW8nT6zp3Pu3umTphV7Y4His6bhsqzIm5a7gc/jvODM2Ns+LyrSRibsO0HAk7xkqBVXVcOyEvKc3btb6N+S8EauZs8Q1qzm3b4ph77eytKypiWppRHwHAffDBzWVjcz2v2kaY1jXs3G6N1bIDBF/7IUtMFG4jqOw1VV8W6Wo5DHfHeW4DsODg4tLZtNydPTDU1SsLcX8eTpivX5mtBxWG8qHIyVOXBeVoRKbHfrtlsbNMi5hkpQi7SRdcF8XXBrHNECO4HHcVHy6CrhapFzdZ7gxz47nzqSpq8scX7oD9G++y7lxYp8mXF58vFO/loXKS0Nu9GUh6N7rIoNSZUSuzHTYIKvPFylebo8ISkzHk6mfPXpMTu7I9nLxwFFVZEkGZuTpZi7XIsz3Pn5NXdu7vOp3V3enj1jkxd89uU7DIY9Ti+uOdqfkm
Q540lfeHVFCddCmAl2ezSPV+QLE/mZ1lvSa3/Uk2k+kN3zViMfSqocDlLcHSnQXi+A2MMZ+VKsM2ODG3uyy9aGrZ7XAn87pgGwRL7EMtrNBG0vqLIRZn/7gsRNK1gUL5j2yBohSXOKk/V2XfD0/RNC36dKCtq6ZbVOOT67pCwrRsMe8+Wa1XLDsB+z3BgiVqDEqCer8PohRC5N23TnWrfyfsSuaUAq8DRFWbG/N6ZJSvo3R2TrlOurJXVRcXY5p6pq7tw95Hq2JClL/rVXf5r3F085S67YlClJ+fFfP7muZhR7HOz2ePX2mGVasFjkHA48Pn97SGUQyLffm7Fa5eR5xftvn9Ab9mRSBJq6ATegevRNgdUdBxzhWAwnffp9n3dPlqRJzsNXDinzks3JCaPdCfghwXhkpk+PbcKY0nDxmPraqCS8oDPIGUxkB29h+3DQwdbW6Mbkzm/Z79ozxL6xPFfQkwKpdAe1WzmbheDtkS6lsCot55EupYBbol9Td1+XuZyb0tBUcvu2gdULjQLA9ZkgFVbbD3DxmLIoCeIIkiVtWcDo0AT3xHJf66vfth3L3r5f1rI3HnWGO6M9SDed7M+iTI4j5j0Xp6SblKO7RyxnS8JQ85d+6iHzdcFqU5BlFZfniw+9fvSXv/zlD//H3/mFL9O2VOdz2Gxo8hLluxS/+wbz336bcKeHO4pJns+JDoe0SYrenVA8u6LnOXjGojPdVBxEAZebglhpkrzCdxUD3eDpGqffg+mU7De/iQo8wkGACj2couLpZcKqrgFn6zwnTkHyOZM1LbueS922LGqRtCUGtnccGLsaX0mC3eOspGyl6O64mvfzikhL0UuMxn3qSeJcT8v0Gyqxwy1bGLqKFnHKa3HoabXVqq9NHr1vEIEW2ZGnjcDlWSNcAft5Na9F139Z1sTaIVBSjLXjMKsq8zzCB3grzelrjVM1eJ7itcs1/cZhXtS8t8q2jUOkFOdlxYEvaXaxViL3a1tmZcPIVfS12j6/5zjCQ6hg2vc42RTc7gVUdUPsaoJA0x9HqNiHskY1JWoyguGA+slz3Mhjdrpm76/+h3/t9/Nh9b043l5++8uxG7GpEq5zSYuL3ZCf/86vMi9mxF6Ip11m2ZLD3i5lXeLohrPlGo2DVkp2vWkGvqbOKsYHIxarDamRMaYaer7P0WDAV9+WSWgyHqC1pqpq5s/nZgplGytbl7UU7aalKWrUNKQtpUgXaSEXd2P+GOIZjmPgdKdzvUtqGhcp1iuz/49d8YP3HFiawli1UvwjtyPeVWbXbhEEK6ODzinvxZ9by13HXMR5LQ1HUslaYujBqkQPA9qsoqwr3MhHaUV1Iel8CocoCnj++JxGO6yTlOTpgrJt5Bz7LuQNTaSFhOirjhRo1xahK89tnPrqWUbpwe7emOtHl6hhIGZfjoPvuQSBRxQFOEqxynIW9SX3xoeUdcVePOXt2Qk/++BnPrbXMMA/fOP8y3XdcrnM2GSVweNafuOtGb/95gVx7ONqxfOzNa/cnzJf5vSHMWVZU+LReoHo6tdXsg9fXuLtHtDUDXVVU1cNeQlh6DIaR7zx+hP6oz793R25n9Lks0vzf19+kETXNrKXbxsp5k3T/Uy78pzxSGxrt370867Q+ZFxq2u6Xbz9eZlLdHddSYNhm4PeRB7HyvAsGz8adRB9OJAp35LxqrxbD0QDub2jusJcl4bn0sofa95TFfK9VQf0p+LCO+xTHL+Lnh7SJkvx38/WMo378QcbDrs+qMuuWbHICK045xmEwt2/SbMw3v203f38iPHOEEcpNknFrGzQWqTSQaB5761j/tOf+8LveR1/5ESP1hDH1OsM5bvCsFYO3t6QNK1oyxr14B6921OSrz9m/vpT2tWa4DMP0X0TMLMoWBg9ud0N503LN1YJjx6vJNAmjmEp+eZNWeHvDajmCc/PEoqmpW3F4z0yOfNPctnJ349DBoZhX5hpNW3EDCZp2u3E3VNKXOWUw9
3Q2zrUHfoaz3EYaYXvgG+mfN9xWNUNsVkBCBdKrHGXVcOslKk/qRsGWm0NaTzHoae10dk32/vZnlMBm7pj+PeUFH7fcSgNEpAbRr7Y7Tb82F/4IrFyOCkKZquC154scIBH64xVXTPUakswBNj3XVwHlnXNoqrxHOEkvBT5vJ+VOAjPoTGPH2sJATpe5nzx1SmzVFzO1lVNWTW8++1L6nUGykENeiI3MVKt9HyF7328Wfd1W+Nrn2WxZuD1qZqKvC7Y7/VYFSl5XXCjd8TN/j5LLNobAAAgAElEQVSrYsNVtuBz+/f4sYcP8X0PrRVXc0l483whrc0vlhRlRTlPefT0jN04xteas43AdZ4n+fNpkpGkuUzisSvFfux3Gvim5eZLR9DzaJZFB0lbEtAL8jkVmP26r8WhblNtzXRUYCbcQHcNgqvEPMeuBFxTzO0qIKvkvJKqK6RWO28fp2ox3aecj7XHrZvORMfu9Ef+VhVQ2xStQmRf//HP/QkINO1FRpoXvPXWY8hrlidzWVtMAkEltCN8gtAoBHyzurDkwmkgjU7ddO9NXkPfpThZc3Z+zWd/9JM0TSukx7pFa8Wzk0sSE6/rBx7TqE+gfbSjeG/+jND9+O/oB6HHwShgscgZ9XxqgwYOBj7JJifwNN93e8jRUZ/3ns7ZbHLu3Bqxu9vDC0SCRWGKnCk+5ekTKcaz52xOT7hxY4Dvu6zXBV7g0euHhJHHcrYkXxtLWm1SEQ/ud1r5MiO4/8kuRMYWODDa+mnnP2+nfUcJxG2h+brq5HeOI4XcptJZDTp0kjgbDGNz4e1kbx324nG3NmgbmaBBzsv1O1gfOtmeXR0UqRTrdCkNh2kE/uK/+zNdCI/WrN9/G3oT6oWB7Q8fimJA6c5LwAb0lJncN+iZlUdlJv2sUwYYl7/q4jn+rZfM/fLt8zebFVdn16wXawkMAqZGpXVysqI/6n/o9fPRhX4yEf95wNnfg7oh+uv/BcF/+7/Q63l4e0Oc2/fwX7nDs0dzNptSJHgIYzwvahZpKVKurCD0ZKK8qipu+z7rsub4196BsoQ0Jfr+V1CuJns2459864LjomDoilWt6ziclxWPspJDX7PnuiyKisQY32gwsjGHJ3nFnifBMvOqYdM0PM0rbgceV2XFZdnwrKhwDTxuyWuhI8Q6+3xl23JedMx7K6sbGplaXysSI9vzDVGvReB6azebGwJcYIq754jtrm/+VsBpWXFhCEc1XWNxWVZ88++/huc4BrUQEuC6bniclduJ33MUE1dzUVYsq5q0adj3PHrGIChrGrJGYnM9x2Hqao58j6JtCRzxAAiVIkkqDkcBsVYcjEIcxyEpa2aPr7l890ryCNoGx/cJbu8S7vS4+2e/+Pv+sPpeHJNgzKZMyKuCnXCH62zFn33wZ/jPfug/IXI9POXhOi570Q7fvnrMpiio24Z1kYlDHZAn4sVdrjI810VHnhTqQDT1r337fZTjcLne8MrdI5RWXM+WvP/4lPxsTW8QSWECKb6rUohlQ5/Ti2toWpyB2LkSuZ1mPDbM9qaVRuD5RordeSrFbpZDoAURGPoSXWuT66zszkGeL6+7gh+ozhzHFnXLZrdOeVZ3bz8h7O0t49/+LKnkMWe5PEf5gqNd6FLMEv7er/+W/HxgCk5q1hC5FGzXd6lq4943ywRpKGrCvb68LksExJxX1eKOIzlfkwTJJMD1XWazJeNhD7cXEPdCdsZD6rzk+nrFs+cXIvFL13jK42b/gGHQ48fv/OAf7EX4XTgCXzNPxRp4bxjSti1/5y//EP/g3/kS/UGIqx12ei6ff7jLs0cXMoC2LWHokq4NDG1z36+eyc7Y9aVouT64Pt96/QmepyiKmnsPD0g2OVdnc1icy946iDqf9stnAjP3JjDYlZS8MjOyONNQtE03NYf9bopPlwKD26z49Uz2+mUukjLLTreFuswEFbAwvN2nB3HHlm8b+bkXSMMwP+nMcarC2O6qbjq2u/
rSsP6ThTQVizMp1FarnxtovW34hV99d8v014aYx+a6MwKyKgDLorce/r0J7N7+oIGP5RdMbsjzN3XX2PiRZDVM97YGO+5QHAKTy0uqoiLuSeNT1S1feDBlNAr5kz/9yQ+9fj660HsepaXsz+eSN//v/QeAQzQMcXeGslfwPHZ3QnZujSTIZjol/NRdirwmNO5sCodVUXNqGPJ527A78Nn/1KFICVyX9a9/kyYveetrpzzciTnyPdZ1Rx7acTX3Q59d12Ne17y2yQysLzGvSdOQNi23A5eN0a3njRTrvG3Z1DXHheTNT1y1Jc1VLey4LlnbCEzfCOu+p4Tg17Tdjj02ryczBjqekbHlrbDqz4uaynSXtWHmy3M3XBvylAKzy2+32v1YK5JaAnAiJVr7Fvj2KmFjXtezomBe1bhO12QUZjK3qIK1vM3bhsd5wWVZMXFdZlXNnqd5mhe4jsPTvGSshZuwrKU5ePtYJvTd3YjjeUae17x0Z0gUuezen1K8/ZT2yWP5RRiPqVcZq195/ffxMfW9OzzlcZKcSvBKsSCtcv7tX/5rODhoR3MQ70iuuhsyDfs8GB+RVTmvTu/w2VfvkmQ5buAJTOxrfE9TL3OU8Z0f9GPu3T/C15p+GPDaG+9BC9957xlHhzv4+z2SLO+IbiMfdkPxsV+X1M9WMM/F5W3kQ1J2lrXWic6Y1RBo2Y87jsDXvsIJtaTcNa0ExliJXFlLgfS1kOYCY4hTtZ1crqUroFXTueWlxm3PMvRflNxZYp+DIfCZx1NIYwKdRt+sBt577dH2OZbvX8lrqczzBnq7x9/67pvHz5YpzdOV3H7gSzNxsy+kQpDvp6EgGtc51VXC8fNLtNbsToYkScaz00sePLhJLw65d+eQs/Nr3rq64jy5omwqlvmGr1+8+Qd6DX43Dl8r3j9bobXDMikpy5qf/q9+hbppGY9DXj7oMw49PncjZro/Zv9gQF7WBIHLnZduUBdFpwO3k3Oy2Ba9wcEedx8eoRwHpRze+84JfuCxubwiOLxt3ORU52AXDbup28bCWna9Zb3b6TuZd1p1O9Hnm87AxnrAuz6ka9mJ55tuV26d89qmY6p7YQfzWwg/7EtxXV7I1+urzlbXcgrsY2Vree3QQf/WHtfC+doz2n0PFmdcvP5VswoIqJ+/IxN7ayD+sC+RuXa3b/X2dlqfPe8IepZzYOV31ss/Hm8tg8tM/vTvPYRkQbVa0js8IhhPmB5MWc7XPH48p24asrJhPs/47W+dfej185GF3jk4okkL2qqmXiU4WtH/wYc0v/x38PYGOK9+gnZ+jXNwSLQjpgJtVm418Te//xa3b/Xpa0XeykQ51KJRH2jNzbtj/B/+ftrFgvzNx7RVzRtfeYLjwLtXCd9KMt4zRjjKETi7aBueFwVF0/By5LHruoTGAMZ1HMau5jiv6L1Aoksb0crPKtHhH/qy0xc2usDYnuMQOorAkaSgvtHYT70uMU6Zx1OIpW7StFszGns0dIhnoLpGIlQOkZL9fWPe+LEr5xy/YMSTNy01ohTImnZ7nqu6ZlbWBAZRCJXwFGLzupe1yA9tyt5A660Pv03duyprJq4mbRpu+C6ruiZrmq1qYX8QsN6UeK7DxHeZZyVPnq548mRFOU/w9kySXdSjfOcJ0UsH6I+QdHwcjkkwIatyqqYmr3O0o/jpB5/m0foden7EQXzIulwbFr6wrz3tMgmH3BmNeHj/BrePdqVQZhVZLlGpDg5O5HLjcIdP7Oxwul7z3qMTwtDnW99+n6ZsOHn3lOLJgvb5Ri6CwBTvdUljSXYHsejEY49ez8TBhsYBL9SCHGxKYdU3bedpb/Tw7aoUGH5lpgXHQPSpmeqTqnPJs055Vl5nDSHattvJJ1X3tTXOsW56lsTnmPVCaUh/jfm3lm53bx/DkgoD3Z2LSeSj50rR1g6OdjpDIMP89/uBQPkWogdBNYY+1TyFmz1Y5DRlA5HGmQT0BxFpKo6HYRRQlC
WnF9e8//SMi4s5w1GPvTimbVteP3+T3WhMz4v+f74q/78f93cCahOOtUwLlFL8yOdv8O3nK6ajkM8e9lgXFXcHPUajUP6rqoYbk5h+P2C0N5WpELpc9+ktKdhlhtKKwcBnucqZXYjb5fGb70G2Jn/8phSx6xMpZE0tBdsWYkeJHG56UwqlHwl0bW1rreVsvjE7bN0l11n2vXWIKzO5bqKhNAyWdGejXaEr2paU19RdRK19rDyRCd+uB+x97X3qstvv26jbIpHbu36n97fnCAblMnv6POmQitr8LrtBt85wg04eaDX1Fsa3z+coue2LHIOtPXAFeUqZl7iHd8AL2MyX5AtJDQyigDj2iXyXt0+WPLw/+RfPo7/87/8u1SIl+vRd9Gc/Rfi5hxQnM2hbvDsHEPWofvcNef+GkUjmdvpiMBDHeNM+jSmMbQu9QKPlI5JQK95964rmm2+Q/sbXaYsKdxxz+1afwcBn7GqOfJdXo4Av9CP2PI9AyT68pxUDrWlbWNSyh87bhme5JLlJcl3Fpm458jWREg9618jf5lXNum5JmmZLgLsoS/P4MiHXbcu8qjkvKoZabWF7sEx2KfrzSlCArJEC2xiuQN5Ic+AgcLuV5pVGuw8itVvXDYFymBnJXtK0LKuGohUFgDY79qoV3/4WmLrudoV6XYm0bs9z0QiZ0Br1XJXGS7+qGLuKXU9zUlRb17xAScGauMJzeDRPWecVRSnRtrd3Y+7eGTCdCExUXq0hDGG9wPvkQ/Rk2O1vP6bHz7/9C1wk1xz19tgNd/ji4ee4zpYkVcIkHKIdxWvnb7Au19wcyJ4+0D5JlbIbjej1Y2HLtzK5ep4rbnBJQS8KeeOtx3zj7JyvvvGe7IB9j+l4yHjSx5vGsBfh3xkRHPRl4i5lytU9X+DnooZVSRgHbOYJzHM4TaXg2cn1RekdmMJnCrc2RTz2qOdZ527XtN0+flN2iXW+7oJi7GNmdWe286LO3urX6xfkfPY+NgWvxXzoOi+E6dC52ZlmQZmkOUv8c2PfqAyAeYFylPAXGkPIWxaUVSXP2baCZEwDOb9zkfGJnt6FoiaYxLRVy/piRV6UFFVFVdWMBj12JkPu3txHa8VqmeBrzenmktvDI/bjnX8pnPH+5s+/weVlwr0bQ/7wy1P+6GcOuFhmnCYpO4MQXyl+/munLIqCnUnEapWzMwrZ5BWjUUAQB1SZ0XEP98TtzTrF9YasrlfM5xlvfuMpVVUx3h0zunFAdOO2wOw7t/EO7+BMDgy7vAdBjN690aXRJUv5enlu9POLD8bJalfu6zhyDrZ4G2Y5m2uzA990k7rV27etwOq2wNcmsMc66NlVQdt0yIMt4lXxQa/7ujKw/Qt+AC/a7W6uDWTfSrHeXMvfTd1N/ZMb0NSo3ZvGHyDpbG77O91jlplZKYRd02E5CluiH10jZF+XyQko8kK8D4oMx3Xp7e7Q1A1VIWuc48sNWjmMYnHt/LDjI1n34dXXv6zbgnq+xplfi5rn+Aq3ScB1cXb3UUUCsyvcH/5DVG+9K7+7WYKzuwObDaoqiZyWaeiz3ogm/rKq6CvN3dt94ru7eDsD3Hs3aS7mvPPGpQTP5LJ/Py8rnuQlDZKh/ok4ZFZWpE3DyNXiEFrVPC9qWoSM5jhs5WObpiVWik3TMHG1qIxaCXkRVr3srRvEG37VNMyrhqmr0cZhr6fVdlgpW9m/D11F2UJmeVPm7wYozNDVIJ9tgmw6W5mzb6xzxXBHbHJD5VC0bKd+icrVosFvOkZ/aJoRe5+yBQfHIAOad9KCUMvthq4Sv/6mpaelmbAuefZcAuWgHYf7+32qrGbc89BKUZQNWVazXJWUZcN46OOOYnSZwnAo+/o0pd0keD/3Vz62jGXfa748y64pmoJ5vsRVLrNszqpcsRtOGPgD1tUKT7k8GN3nW1dvoRyFdhSzbMmmLNjkBXEvxA098rygzipIKkoPXr5/k93pkLgfsb8z4mq25L0nJ3
iuS7pMYVlSL3Pqq4TWk6n18N4+q8sVzbLAHUc0bUN1nXZRrMaZcMt2t8YxVkriGFKdCZaRGFpTgKvGyFHqzurWwvDQXZAtXYa9heELQ7J7EdK3DHteeO4XezsHOY+8NgY8TqcIUM42gKfNjM7fFwld45ji7YrRT1vUqNijdRWcmfci1F3j4Rt0w1OiQrBNQymvt65qDo52SOoS19VMRwMBHpqW5Trhar5k2O8RRT7LPOf2ZExZl5RNhVaKLx388Mf2GgbIA/fL52nJclNwsS6JApdvP5pzlta8ctBjv+/zZJETBZo/8+lDvvL+jHVS0u/5Jr7UoSgb6d20J9aqVlaoPYb7U8IoIOpH9AYRi9mK5fExtaMl8CVd0qSJfF0Ke13v3hRZ3WYm07stYuuZsN8tq9z+3PrON6ZAp4uOqV/lUiAbYyyjRMfOeiYZ7SDs+7Y15jfrbjVglQC2YbAuedoFHPneKgHs5Ow4ZuqOOgg/GpikOadz/YOuoYiHglS8IDdti3wbUyu6/7zzz68KQU9ou2CfqpDXtiXjmWAfqyZwFEwPDRk2JxiNcFwfPJ+2rinXa2ociRcuG/b2emR5TdUI8fQv//Cd3/M6/siJPv3113D/8I/guJomLVl/7T3iP/1T8m9ff1f2847D8S99HZI18U/8IGiFunkEmw2MRrjDmLDnMxr59HoevnIYac144NG7NYHbt2EyYfOVb1Jcrnj5lQmHRz0CJQS0QDkc+S57rssnopBVVaMd0YkHxo3OdRxuB2L1OivrbXxsafbWDTBx9dZsSyEwfmMK93kpu9Ckqekrxb3QFykccvu0aYwhTctAKw58vTXTGWpnG6gVv/A1sLXWFemb7M6tTt82F6Lxd7b7eosWjAyb/sok3I2MTDBwHAITLBErhQKDJEBhIKFZ2XBVVcyrmruhx3UlTnyi4xfuwnnZcFyIB/9lWfG7J0t8Q8jL8orhwGc09NndDTk6jMVAx3dp8rL7hRuN0MOPN+z5O2ev8+O3fpS6aUirnF9877f4gYPPMUsXvD1/zKba4CuPX3zvNyjqki8dfZ7IDRgHQ270d9mJIgbDmF4vYm9nRC8OIdA4OyHjYZ/hqMfL0yk7vZg33npCXdW8+vAWu5Mhbuyj9iPZXR8KQYydkNPjS6wvvVKOTMexSabzlUz1YOBvRA9vD1tkXTNBV2ayT6pOa+4q4QH0TPqd4oNadGtnu31MM5FbpzzldIXanoc9yhd2+WA62xeQhaKRn9nz0E5n1GMCcfR+3EHyPc8w/h1ZZ1gtf9UKiXBdwm4k3AUr9Vub8J7cvO5K/j575wzPlcjnTZrhOA69KCQKfe7e2KeqKrTWOI7DZZJQtw378fQP4Kr77h//628949/80k2SpORqkfH6o2v+o595iSdnK3759ROSsiLJKv6H/+1N0qrmp75wgzB0ub/f5/IyIY49PN+jNx0LkSweS2GKR0TTCaNJnzu3RgyHAefPr2jqhsGNG5LVHo9EJw4mbvYAhnvGzKWWid8y7V3fxMMm3b7bcbakMrxQit6L+3HrCW/XABZm703k+Wzqm93pOw7bcB0bTmPDYqzTni2gdrLXbtcUWL6BJfq9WJTDvtmvhx20rz15PSsjL+yba2abW9/IpJ8uOzWAhfTt4yXzDsF4kVtgVQbQTfcXTzr0A7YhWkEcEU0nOI7EMTd1Q1HUuK4iDlz60YcbP31koc+uNtS/8U/x/tyfBcDfH8KdBzg/+TNipqMU1ZNTDr9wi+Yb3xT2fNOSfOUb0DTUj5/haAflaoKDIft7ETvDgIFWzFcl5flSdhHrNSryWK1Lnj5e8rV3rjktShzgbuCz77k0wHlZUrSyt25oOStKtNHXv5+VLKuGiZlYfSV+8lkjEbMgCXWhKdj3jNWreNC3VAisvjR7a8f826Hv4hmCW2F08SeFTMmxdrbe8oUh19UGvg+UyNiEr9SyqmXnLg2Hs33zbVJeX8sUPzSPCdKsjLTiX7k1pj
bsfdHKl4YjIPv1wBHk4VFWMHQVu15HylvVNbcDvd3fV4Z3cORL43NZ1hz5Hi0t76YZF2VFWbbkRc3sWgIywkFIvBNTHF+j4hBWK5wwgsoYKH2Mj+tsyT85+U1+9v6fINA++70et/u3+GN3f4KyLlGOZp6v+MzeXY43x6zLNQ4Ov3XyBrEb8cbpuXiCpzlh6LMzHtCPQwLfpyzFsWpTlsyShCj02SQ5xyeXPH5+TnWd4nse4U6PMA4IfNcUZETfnlYUTxYCkTuOmOmkdedvb6VtddMR3QBj9yiWuKG7DXXZMvuLGpZF56s/Nixmu6e3MKZDt3qxu3qQ57ZTvfXHt0XbRuS+CBPa+7mmYei7gkrYpiB2+eSPvrqF/1ta4RTYFYNvZIPKEfc/V30QpVgWMA0/uKv3VOfqB6jdCIY+xfmG/HLDcpVQlhJsU9cNZVUzmQxYzCUbfJambMqM5+tzQsvy/hgf19cpf/+1M/7mv/EFRv2ALCs5iEP+2OePAOh5Lkle8dM/co/fPV3x3sUGpRx+881zvu+T+5yfr2nbljIv8QIPtxeLb7vnU2QFeV6R5pU48OUlm8WKzXLD/DvflinX88XUpW0ForcQM8jXy4tuV54susjZzXXnmFeX3S7eptNZlrolr9lgGpBCONg1u/JKiqlFBSwhb3Pdkewio5u3UH2eSOG1mfXWH79tu+Jqm4eq6CSANlJXe51dbW8CQY+jH/ySPFaVy79lq44x78cC3YOsGfJECnaZv8D8dzq1QJF2Ubm2obBkx80c1jPy62vq+SXVZk1++pSmbvBDnzzNiWKf2Swl8DRpXv2L7+gHD/fJn17R/uN/hPfv/1X55T4/If9b/5PIN65noBX6YId6nVO+8S7VbEP8J38MZ28PPeqjd8dkqwx3HBPsDQCIfZeX7g/pf+kV2qfPKE+uaMua3Zf3th4dP3g0YlHVnJUV30oy3k5zgbINtO05DkXb8q0kZ6g1D0OfWCu042wlbyATt02k8x2HHWMa815WsucJUz/WDqdFvSW2DbVmWdfbJDoQclysnO0O3jVT+KySfXfadP72QqaTr30znETKMbwlKfpV224helvYXccxaIUyMjvRyr5xtt6irWuzU9r1JIa3aFtqWk7Lih3PZV0Lb+CylGkdYG0c/ex5V6YR+XQcMnYV72TS/cZafAre26SsNqXI65KK508WrC82eAdDIVqendE+fQJxjPuZVz7qEvqeHzvRiMtkzq88/1X+/Et/jp1wxHV+zS89/se4yuUivaBtW1zlcpXOebo6ZVUk/JE7X8TT4pXu+1Jkg9BnOOqjlEIrh5ce3OT+3g6PFwuuZytcT/PSy7fQWtPkFYcPDsg2GdnlmuzJnM3zJf4kwosCiF3cUSiEuYV08u5Rv/uN3JQmqAb5vWuR27YIpD304TTpiqWnpXjaohxqA+938PZWXlfU3V4prboGoWmlqbAXvf3gqF9oAqCT2nnqg58ged01D9rA9Ebv/uY7TzqL200pzxGbxkebAr4wBjlgJHZNN+HPjVFQbHa6K5nwg9sjaBDb3LKGSONOIqpFRl6UVHVFmhWsNinX1yt6/YijyYjZJuHxYsFFcr31vf84H7dujZgtUv7i//hP+W/+9GfYn8a8v9jwf772HKUUXz9dC9rbNPzady45m6csFjmff7jL88sNo1FEf9ijyAuiXiSa66oC7THeG3N0NGCxyCjLht6ox71P3EG7Gvo7eHtHBr5fy2RaFfi7B1JYQYJY7L7fFZSApu6iYi173fWlgNoi25tIkU+XnTbeatztLt7q3e0E75v/K9skbB9TiSdAXXb6ekuSs0dVdJK/uuoey5LvLCnOCzqHPdsAZOLxf/La1zonPwvp29XD1nCH7nWANAMvwvStgfKjoawm8k3nGGiliUrD7q2OoV8VEI+pq5p0kxJEAb2eT5ZVnJytubpOmfT+Bb3u67SgXufQ65H89b9B+J//dcr/+1fJjq9RkSeRsgA3b+
LuDMieXuH4mvSXfl0m/N1d2jRlvigor6SjrOsG31e42oE8x+nFuMOIap2zfjLj7r0hrxwNqJuWG4HPUGt2Pc2twCOpG06LkqJpWNYNY635yZ0B66bhrKzwHZhVNTWyMx8ZM5tlJYx/5Yi+31cOtwJ3S4yrWpi6iid5afzgaxzgsqrZ1LL7TxopzEVrJ/HuArIZ85H5YNQID6CnlJHoyffK7NI9x+HQyIheLPKdD79M/yNXs++5tAhisKpr0zQoEsOcT+qGi1KaEgdJ2FOOw9RVKMdhWUmq31lRb61+AVyEye86DjuuK2z/Vt6/smk5Me/zvXtDyqImiqTYVcuU/OmVrGZOT3EGg4+6hL7nx2Uyp24bPOXyX3/1v+OvfOov8Vtnr7EuUwLtyzXZ1oyCPsOgxyxdkFU5b1y9zTLfMAlDiqJithCZaZYVFGVFEMhEP88yPKWIeyF5VvDsyRk74wFHN/ZIspwgDmRa7QksX1xsKK8TAKpFhhoF9O5NIKmoTtdS1DaGnV6Ynb2iM9Oxu/a6lYneFvpASSOwKuVCWBtb20UpTYOnPrhr/71+861jnr22pXJ0UbftP1PwbZSsRQVCmxHedM51A494xxh52HVAXouUMKul2Kd1p/PvucYsR8vXypFmZCChNVtDi0hul5+t5XvbURcN1SKDsqGYp7RJxdH+VHz4ESvZp+dXpElGWpYcr1ZCBPyYHxKkVJImBf/W//zb/I0//kn+j29ciIFO3+d0VVA3LYPQZdoPOD/fEIaabzySZL7aqoOSFWEcih2u1pAnVGVFWUoiXl0LJPz03RMGY1HZKKW6PbiB34uLEyn+yuzwDx7A3h2ZvG1+/PXzzvoVusm3rrq/4QVzHNXJ0ZKFFNsi6UxjUlEDbPPsoTPUgW4q9oJuVeA4HRsf5DnsXn67Wvhn0KmqkH9Pl+a6b+Q+B/flcYq0Y98XqaAKljtgYfmwLxB/XQkqoV0p6OHgg0ZA279V5+Bn36t00xETHYdgOKBaXqO1RinFYpFR5CVVVbPZFCytUdXvcXzkFe6OTMcTRRJH+9Vfw713EzfyCT/7EJIEFbg4owkEATr0iV65gb8/xAkNu7JuJH3taExbN4xHAZtNifJckrdOKE+uKK/WBHsDBg/2oGkpi5rLecaiqinbhokW+DxQilgpPOUw1IqsbXic5BQG4h65mqqFiSvFtWgxefJS8F8MqYmUFD8EkbYAACAASURBVMLrqmGkFQ7Q14rUMPGtBG5WNVsIvTJ6+HUtuvXGfOjZCblqZYfvK2cbrNPA1p62Z+x2Q+WYKRsz2cvfviNSOAXc9D32PI+1ybMvWmHj121LaqR330kLA/urbTRv8cIHcagcNob0dzPQnBQ1G9NENLD1KFjWNUMtTcW9QcirA5HF5E3Ld95fEEUuKvKpFgmOdghevglRRDOb01bVR11C3/NjFPbJ64Kyqciqiv/n5Fe4M7iBr1zuj25TNAU9LyZ2YzHPUZrDfsfEHochSjnM52umQylYe9MRVVXjeS6Pnp6x3KTkWcFo3Gcw7JEXEt6yXCdiuuMpwp0e3igSGD32pOj1PZqrjM0qlUI1MCY3is6atmy6opeZnTxAz5XAlxbZVdv/dgu7+2pLdNvK3dq2g+/tFK9NYa8a+dpI24COUWohfOWYJD1DErQaeyvp81TnrtdCcNDnxs09knUqPgFNK/cJdGeuMzcQvm0wzL7+A+53LVvDHYndbWVnb1357L9rB/oe/iRidGcKZYM3CDm/muN5LmHgkyQZSikGwx73xmMi1yWv8z/Qa/C7ccSBS123JKuE1Srnf3/rjINRRK/n84P3JgAMIw9XOYSelmS7GyOOdmJ8X6OUQ5pk0LZo25iZ4hf3Y+bzjPk8JQxdJrtDesMem5XovvPZ5da+1pkcCCkNDIRt5HTLS8gStkE2RSo8gLDfPZeBv7dyNVuMLVHP7udtIbSwv+N0DHYry7PEUCv3A4HEHaeDyq0pThB3UzIIpB
6POu6AbRDsXv3FQJ6qELc7PxLf++VFtx6wyIFFD9JlN7Xbqd8iD47q5Hr2/Smzbs9vX49VBgAEEfrg7paTkJ8/B0C7mnSdUuQl/UHI/n6ftpX//w87PrqVtTg60PvMLYpf/EfUj4+J/8gPQBjSPn+OikPaqwvIMpq8pLpaoT/zSbG1vb6mvFpzdBjTGKmOUg4P7g+p8hJvpy/2upHP8pkY8uR5zdCwjhe1MO6/kYgxzlVVsaxr5lXDG0nBZSlGL3a6Ps5FRtaYfbrrSFFf1y1nhdjFaqSALeuaqauNvl0CcDTC0nfNZ0tiNfOtaNuTWhLqWvPGZY0UfllnGv27Icbl5pxy0yRY85zA6OVBCvGh7+I74s6nHEEHHoQBB5FPZmJsoUMlirbl3FjwWl992zAsaiPRMz72u55ICy1Df+gqjnzN2FU8y0tCpbb5JctazIxONjlnaWFCeRrq1qxWN7mQnN46pXh8Jp3+eAjHxx95CX2vj6ZpWBcFrtK8NLnJPzl+jVW55odvfD8tLaWxxM3qjKRMqduGy2TO3eEtJuGQe6M9FvM1L92/wWy5pjFWwzePdlkuNhJlm+b0+hEXFws8E+SktcLVWiRvxxuy4yXlyUrg9cQ41Z0l3R67bqWQG0h6S7qzMPimkuLnKpne1yX1Its64Dl2Qg501xBYuRt8sCC37QfheNs82In9RUMd1+kaALuX16q7j6e6Qu2Yx/AUNx4ccGN/StvCZGfYEemsA9/SuoSZJsR15Nw3Vbdm0I4Ud8eco6e6qF1roGNRCkv6m+UU61wmeOVQZgVFWRL4HlXdEAQe17Mly8WGnucxjSKus+Uf3AX4XTqysqauG4Io4HOv7vG3fuE7APzsF2+RlA1pIamTq6zm/dMVdd3y6GTJ524NUUq8QYqsYHTnNkVRUVc1QRQwPDpkfjknDF3KoqRpWuZXKya7Q+qqxnVdCM3O+OIR7epanPKgY+0vjFGLnbxfLIDWnla5AlOvr0wwjgmAWV0aMx0zZQ/32brcWV25nXJtIIzS8lw23hWMYc/ig4XbGvhYc5wXGwSLLmhXnm/rTKe6fPqmhp1b8r1v5HE22KYqOiKfve3o4IP6+yrv+Al+JI2PPU/rLWAbAOsxUGYC9QO0LXWeSXOxPBc0wJcBJB7ELK+XzGdrsqzEcWCdffjQ9ZGFvrhcEd7ZheUS9af+PE1e0uQV6o//HAwGOIeHEEXkX/maGOr4Lu7hFLIMkgTnwQP0UOz8HOXgjiS69vIqI9gb4O0O0HGAjjzGrx5QXq6YL3LefLREOw6Hnsen45CHoc++56EQDfnU1bwceey4GgUc+K6xpBVinVjMOtzwXZ7k8uL3jP3uxvi7p41o2JUD87rm0NfkbfsBdjyIjK4wevlQ2aJqyNBmkrafyZ4h5om/vuzzI1OcrQWuhdSnnpgGgbHV1Zq+VvS0YmcacvdOnx/60bskRu7XII2GPY+eUROY62Hr1lc0QrYr25bn5rUvqmbLGdCOw4HnEWvFum5IG0EoImW+N/wBQUda8rbhYp5xcZlSzDb0XjXsW61pV2ua6w9PTPo4HCebGX3fZ5Gv+JEbXyT2Qq6SOZ+dfh/a0Qy8PrEX8mhxTFKlHPZ2mYRDXEdznS2pmoo4DqnrhjD0GQxiPFdzfjlnujPE813atiXwPXZ2hizmazZpzvnVgsB38aYx+993E3ZDgptS8PQo7EhykRboOtCdXz101q7jQIqiLYSRFlje2uQ2Qqprl4XY6kIH01etTPuh2xX1Fxn31kHP191KwDrk2cneNh32j2f2/9Z/fuDL154SFCHUuLHPeDzg4cOb/JEf+jTX16vOUz81FsCOI49jkQvLF7BNiG0ekqpDK6p2W9j9/V4nQ7RmOwMz0ZQN9TKXcyob2uucxWrDfLnm7Oya6Y4gMo/mc8qm+ZcCup/NUpqm5f7DXX7mU7tMJhHvn6/42VcPGEeaSc+jbhpef/eS09MVB/s97h3J5L1Y5IzHIW
Eckqc5QeAx2hmCI7LEo9u7stIMfCaTCO1qzp9f4TgO1eVztOehxntw9LIU1tG+FLD+1Ey+ZmK25Dm749aucaZzu2z4cCCTvZ107RRt4fJkztaIZrjf2csCW3vdupLHtCx21++KqV0H2GbhRTMeC7dbyN6S+upKbmM18hZlCHqEgx73PvOAL/3EZ82ufmVc9AxzvyrkNdloXfv8lpxnbXytBM/15fVb58DhnjRBVo1g/fBdXwyKkqVRH4wF4VhfU8yvuTy5ZDQdobSiLJuP1NDDP29Hv0jFsSqKKP723wbHIX9+Tfv6b8CzZ3B5yerXvkn4R3+EtmwIbk3B89j8X79Ds05oNxuarMT3xY+7vFzh+5KKFtyY0GQF+fNrkrfPKC9WlEWN72mmsbeFoK25y7OiMBp1gb4XVb2dmkOluCgFfm7alomruB24PMpK7gQuE1Ncz8uanpLwGZn06+2O/LSoCZVM3rOypjCFc123xAa2V2a1KSqldjs5K0cMeeybGRlNfGiKu+842/CZSEmITmSahqFW7Hkeu57LQGt+7CdfYffmkMGPfY7V02tqA827jqwgQD7zV3VDX8tjOQ5bFzy7Bjgva6PFV9wKXOaVoAAXpeTT3/Q9pq6E+nwmDplXNbuuy3lZcV1VW7KjwiFyNVXVMJ/n5M+u8V+6JfyK0RC1t/P7+qD6Xh1XScKN/h778S5/981/SNVUXGdLZvklJ5szlsWK78ye8InpQ+qmYRwMyKqc3zx5nRaR5OW5wPJVVTO7kumvLCvuHexSFhXrVcrTJ2dkWUFeVASeSxT6VLUYD52fzaCsyRcpaEeKUChGL0AHs+c1Tt/rfO73QvF+73nyp2jgOpcC5xoYfV1uoXLmRUe4c02xr5tuSq+aDiK3cbTQmeHAB+V1Lx5Wn182nRuetattW1TkMtkd4fiaP/VHv8iXXr3Pv/ry9zHLss45DwSBSKuOLGgncuuIF+iO3Z9W8se+P4WB+zclRVnhTyJpbtKK4NYQkkrkjBbNML8vehoZb4iKpmk4P5tx50gms+PlkkW+/m5fdt/14/JijePAjd0e/+XPf5skKTg9W7PISr5xvOJilfPtt6/4gVf2WM03vHQ45PnVhn/wlSeMx+F29x5EgXgLXK8IwoAiL9jd7eE4sLxe8t7bZ2RJRrpJaeoGNT3EUSLnojZ79Y2RitkAGetU15tgc9y3Ge9+JIVwcdZNsXYKtqxzmzVvY2gtsc/62FsJnp2ULbxu4W8bGGMhdOgMetq286FvWzmv4X537m4gk7h93tEBznAKowM+9WNf5NVP3+Jf//EHzOdZx5C3u3hLGCwzea5o+EHuQV12/AB7zlYp4Mfdf27bSGxw24gBj6OE4Ng2XZRuXcr6JBqCH9I2LYvZgp3dAZ6n6Pd9ZqsPj1v+yELf/4nPU5wvYbOheD5HxxIfm/78L+D8ub9Ak+YURUM7n9PkpXjfZxnuIEQd7MHZmXztadypTPD+tMd4FOD4HvU6R8c+i6uEOsnRyiHLKzZZxbyqOC1LsqZBOzDSmiPPM2Q5cY4bmYlYJlJTmMy0nJgJ9rpqODP+803bcm0Idpa97zvO1jHvvKhEX4/s60MzlSdG+161nQ7ffg5aBzyN2esbwp6Ehdlzkgalr7Wx6XWJlCJWmsBR7EYeg57H1HPxbu9zeSyyw28+XphAmtY4+nUqfddxCByRFnY/k5/fDXw2dcvmBcnf0BgEHXia07JkXdcow/JPmoaXo4Cxq/l8P6avNUOtGWnNTugRxy5tA/2+J+uZJ6c055c0sznp6+/8cz6ivrfHDxw94O3ZMXVT42sXX3sErs+vHn+FP/PgT3OZzlnlslfztYevPfZ7O9RtzYPRPb51cc5LNw9Ik5wo8NndG+M4Drdu7In6w3eJ4oCqqinyksB3yYqSJM3JrzZUSQFa4YZSnHuTHsE03vrZO0O/m6yVQ2s94CNXiHSN2avbXPhAmwJYSxCOtZO1EH/RyMRct0Z+Zn
7F7Z7eJtBZHbxtAFo6e13ojHFsAR6Y5mIvAu3g7fbwegF+6OEGHv04Is9L9qdj/l/m3uxJluy+7/vkyT2z9t5v913n3pkBBoNlAGJAkABJwQBkmiZlkHSEwrLD4bDDYT/IDluhF7/A/4Ae7FB4eXGE7HCELUuiKNlcIBIiCIIECYAABrPfbe7Se9deuZ9MP5xzKnskzjhIWcRkREf3ra7uyqrOW7/f7/v7Ls8MR2RSshWOePvek3Z/7ojWyMfs6kE3KvqxTdMR2Jf28VVLwLNQuvrTlOIiUfftuOTjhOhKH9dx6FzpY29GuIGH0/EJfJVC6Do2QeDjug6Pj8+5SFNiz2NqoNIP8PHZT10F1MsUxx5h6BLHLv/dNx/wt3/mGc5mGbZt8eRixc3bW0Se4Kef3yIIbLb7AWdnK65eH9HUDb5vE0QqGCfqRghhrdcCySKh0+/QH/Wpa9WoViePIVkoiV2oip2Iu6yjXf24Lbxm7x73FXR9OZGuKlqXPENYy5Yt2Q1aQl6Zsw6xMSx1aHf5plDnmhdgoG8/fjc07/ptURa23ttbapI2GnfDjt+8Cr1NGikhiLm53+PkZMkgVBGwa2KcaUA8nT1vbHplpSSIJrnPEspBz6w4mka7/UXqtenvKFhe2Gpyt13q+VgV/VJzA8xO3w1wuj28Xm8dCRz3YrKsZDxOmU4zqkv14V883n/en0woZynV8Zj4Y9eRSU6dFDiDGCZnlOcLwkGIvPcQb2+odPT7+wS/9CWsFz8OvR7WaEjnsx+mOBpjOTZymfPwnQXZ/RMsz2F5tqIsao5PErxRrJouW1A2DQ4W40pyVkreyUv+dJWR1AqWvxV4aEUuk0oV7qSu2dRTatU0bGut+Kbu7CNbrPPmpxrOnlaSeSXV5K2n70AYBjxrQt1K7+ONDa45TPqnccZzLn3b7NE7tmDk2MR6eh+4NiPPpWMLuo5NoSe7jY0A69nnCEOHZrnEsxQSEdlqJWDke1nd0NPrB6PjNw5/jgXjqlqT+xJZr5sex1JkwFAo+d5cqun+uCg5KysOi5KybsjrmvOq5Go/oNtxqeuGjY2AoqiZTXNE6GLZAtGNcQfx+15CP+4jdkNk05DJnJ7fwREOszyhrEuKOmeWLdiJe/zo/E0cYfNofoRve/zs1c+wEYz42evPsxVFfOT2Vc7Op1hChX7ce3jIO+djRbqbJ8yWCStN9BLCIgr91r1uWSrnu2nB6t6Y/HQJvsDtBjTGW94Us0Whpl5z9LyW+W609Xoqbsyeez2NWy00L1D7bmgd9cx+3rLe7YDn2q1DXkP782YKF7R8nVJieTa2LQgDD9dRXyv7Xwff9/jo9nPYlsVFNmEyX2min9XK/C4fvm5UzGvV9VhH7JpwH0PSM5D+qoTtsF0F6N+dLBJyrWiQq4JymeF7Dv1uTBj4dGNVTE4vpsSdiKKosC2L4lJw1gf1yEvJYBBwMkm4uEgQwqIsaxaaae05gp2dDncfTuh3A944nNPxbP7apw/40oc2eP72BvN5zvbekPksxQ+U4mR2dMbJyRLbFszOZwhbsJqv1kXesqx10Arzc5ieKJe84wfq9t5mW+jNPhzUfc0+e3mhpn1LqEJrirwh3slKFUCz3xa2Ste7vJM3zHSDCnihun88aJn6huRm+AGXf8YNVIF1/daSNuqrnbcXtlJBQDgOju/x8q0BrmtzOC8pJxftGsLxW8Jc2FXPw2j1F+P2Mc35XE6tEzZrX/xsqdYfVbF2GyRbKsSkSLGjWL1uZQZRDyGEQlY6A7qjPuPj8Zp/sbkZqb/VexzvX+hdl/5/8zdxNrogBN5un+j5PdJ7JzS/9U/xr28Tf+WL2JtD6iQjef0Qlkua+Yz6G1+HOIYooj4fYwcu8zeOcLe67GyH3PvhCXVe4vs217/wHDdfvk6dVwz6Hr2ui4WafJ+PAvY9lzuhx55n4+kp+YerjMd5xXkpueq7dG2lKy+bhgdZSc8WDGybri3W++9IqM
I3cGw8C216I7TGXKyLvDKuUUW21u8xrp5+z0rlkw+quLt6sq5plLTtz3gZq0YVVwdLkZeFRRQ5a0/6smlwHMHOi1dokhWe1hL7QkXKqvdYiw1XNS6epfLnl5oIWDQNF6VKuAN4lFfrlcSmazPTsP0N3yXRhd6cd9Y07HkeUvMIjP+/hcWjWcrD84Q0rQhjl95GxOhqn+SNIxrNYajz95Z0fBCOVZnyH7/4K4zTGYnurG8P93k4O+aPT75Nz+/w8t7H2AiHFLLi1fNDHs6esixXfPPpt4ncgAY4WyUEgc/J0QVhFLC7NeTpkzMsy8L3HH7qkx9id3dEXdfEYaA88XsewhV4G5EKXdkOlV+7Tr4rDxfrCTvY6yEiDc+DKvxdDdkbODvU7nlVreRpltXuvo3e3BVtt2kscA10Dy0J7/J7Qi7bqd7i3d2qaRBANR1CYFkWtlC+/5n++xdlRRwGPP/sVWQjCRyHqpaM+h36V4bt7w0uMYxdoTgEpunIKoVcBHbrYw+qEUm0IVDfb5/vGtKvFXqRSUTXI80Krem3WM0TDo8vWK5SNRgGHtcOdjg9vsB1bbq+/75vkB+k47/+8h2qqub0ySknxwsOdrvcezDh/3z1iF7k8W98ZIe9vS6uI/ju9x7zmz885mhe8A//9ISdQUgYuqxWBVVZMRvPCeMQy/cZn8+xLIveqMf121fW077ruTRNo/bDwob+pipevW1VsLsbKpXt/B0FR1eFsrGNegpyThcKcu/vtBr1oMPa6z1btrt7Y5Nb5u2O3ITRNHWrUzcGNAYVMI1FU7foQNO0bHZzGO99I9GrJeQJwnUhjNVngGxF1I24cn2Hi1WF6woiV+BvbOlJPlK7eDdgbdVr6+x6YavbjBbeEOvMhG/88s05mzXD5UAfgzD4kUocNOuHyRHFdEJ1cQIN5GnOxt4Gkwu1Ds+yqpVQ/hnH+xb6erGi+d3fgjCkOpshPvESYnuT7meeo8kLrF/8FRhuQBBgf/YniX7uJVXcDw+R84z5P/g6WBZykakQkL0+lmuTJBUf+tKz2IFLdGubw99/m/ThGaePZ8zmBZVsuB76zKXkqCgRFoxL1X0/KUqWsmbLtdl2lcZ+XkkeZAU3fHetE+/YKqWtQenhTwvJg6xap2p6QtCzLc61GiDTEHhXQ/aFZsnXKNOdQMvXzArUTPCVZvc7lrUm6G26Yr0vH7kKBo9tQc9VjYpjWywWBV1bEd66gcPWplpxUJa4XcWs7MbumhQoUCsK1Xg06yn9opScFJINV6y/71uWZvArMqGF4gs8KUo8C97JC0Kh7HUDSyEBnm6CTsqSlazX8b8dW9Dvedy7O+X1V89oCkn8kQPyJ2OaJMU72Hy/S+jHftRNw+89+Rab0QDbEkROQCErbvb3OEku+OvP/gqb4Sa+4/Li5of48s1PcGtwlXmxwBaCrz14ha2oy1I7rXW6EZ7ncHw24a986gUi32Nja8Cf/OBttcOfLVkmGWVZ0d/sUecVxVQRcZp5obrGo5Uq0KGtipWwVCTreaq08RZQ1SqC1ujLDYQ9zlRz4Gm43kD5smkLoAnBuWx3K3g3A99A9YbdLjS0blktAmAewxWq4Wig148JfQ/Pc6kqiePYyhvDdYijgI1IyRSbpiFwfEabfebLVSu/M5I8Y6YDap+eS1XYjaWvY7XrBUOyix0414E/J0nL5K8bgn6E6PvUF5lSOuRawQA4rkO/F3N8PubNu49Jk4yDazucnkw4XCzYDD/YXhCgruP//btH3DkY8JGXbtLp+lSy5qde2ufuyYq/+ZM3uD7wcYTgK5/Y4b/66x/n4zdHvH20oKxqfuePHq1ldUIIPN9D6LXOyz9xnY2NkMGow9nxFNd3yZKMMs1wPVeZ49gOLDVRbnmhitf4UDcAO2oPD6oILicKsjbJckXaTqzpXBVj8ztyHQbj6K8N2a5p1JRsJGhlrol7OnP+clgMtHwAs882zZth0hvin5nE4x7WYAsAx3
GoC72+GwxxXAfXtZmmFbatzLHifqzOsanVlG327FXexs9mCz3NR62SwNj5Clvn0Nfq9aq0BHB63DYoQRe7v6HNgxbqo8zbGN6mxh5uK0Le+JxkkRB1IhaLnCyruLbz3tfx+xZ6MexDFCFPL1jcPaH8/T+EJCH54UOqaQInh4jP/jzyyRGsllg//QXEl36Z6vgCZ6vPYllCp4O7N1IFwRbYkcfmXofx9x9TTtQL1B+G3H8w53CcMVuWjOc5T7OcLVeZ2sw18W7o2NzwPQSKDKd06ybRzWZcSfK6WUe+hjoA5opm5fvCWhdOQ6IbOIJEs9ULrUM3cHiif5ewLBb653xN5DO6eTVRQ6wf07w/DbTZTd+2GTmOMuApKiaV5DgpOCsrLqpq7eDnH4ywP3QHZjOQDendE6bLgk3XWbP9B456PpFuRox9b42S30X6nGpgQ0fZJrJGaMSiYws2XYcDzyAmNrZlcT/LWckaTygU5ENRwDXfo2rgrKyQdcO1a10+/PEd7I5PcTonfOlZrF4Xtrbe7xL6sR+ykdhC8I1Hr3B/ek4uC1ZlyoPZEVmVk8qUZ3rPUsqKpmn43N7n+Pzez3CeTvBtD1nX9P0u2xt9ru9uIoTF5rDHjYMdvnf3HSazJa7r4HkODx8dMz6eUJYVeVEyO5oS9yJFvNNFTfQ8vGt9VYxl00rOmkYR8OZFS0oDVfjSSqECnt3uuYVoC3rstrttU7yNZe3lYm9CcsxhXyLoCV3sBa0jX2irNUHo0O1FOJ5Dlhes0pzJxZzZeE6+yNbWm/tXt/nZay9wno6xLIs/OXyL46ML9rZG60bhX3LTE2hzHakjbTVJz9X8AosW1jeNjSuw97vYsbdWI2RP59S5Ju71PJwrXaytAFydd9807GwMefaZA8IoYDZd8tyNKxz0enxm76V/7dfhv+oxnud4juCtxxPG4xTbFrx9f8x3XzthvMypm4ZPH2yQZCWztOLLz2zzn798Hce22OopM6c49uh2fTa31b47ijy2rmzx+tsXHB8bQzPJyeMTqvEZju9TzOcUZ0e4g9GlouyBc4ntXsvWJc6oO4z0rWnavbvjqYlf2DrxzYTBaOMYw6SvL0H5lwlpZc46KMcUbNdvf36tSdc7fNtRP2s7qrh2N5XpTWcEyUJ5xk9OqVZLyJTvfFmURJ2QL376KkVV43k2X3/9lNVs9W7HvHioCrsp5MYb3xAEbad9XQw739x3eaGage4mbF5Xn7UFsDx555Lt7qBdDQBUBbKSRHv7eKNNgiggSzKGw5CdnQ6fudl/z+vHec/vAPV0juWucL74RYadb1JNV7C5idM/x/3UizR336RezLCERfq1PyAEGtfDefYmLJfK/e7kBHo9mpNT7NCjmiY0ssZ1BU0pefrdxyyWJVXdcOtKh0qq1LRypoqua1mspGLBf3+ZseEqZv24ath2Bb6wWMhmXfCEpaZyA2Mf6/13oT3vjUTtqu+uY2U9/ean1oANgRAYHm7nUlCN2YEvZIOD8tM34TSXj54j6AglcbMtxcJPamV0k9Q1Pb1SWMqaTsfFsiwsW9A8eoR17RoyK7Asixc+tMm3f3TKpuuwlAXnZWvDm2nCX4NqMmrUKsLT52jkdoY8aFsWW47NvbRgy1WxtEXdEHoWe57LO3kBsqZrCxZS0ndsVlIykzX1+QpvnFA38MJ+j+2P7MFyCYOB+vwBPsbpjNiN+A9e+AX+4PBPsC1B7IY0Tc3V3h5Plo95tHgH13b4xtNvc9B9yEHnCs8Pn+E8GyMsi6Zp2O10uHcxxnFsltoyuNKrl3t3n7BKc8pKcv2ZKwCkac55VVFWEupaFau0os4khSGaZVIVUhNJa1jttZrOm2WpdtbLFcU8a+H5rIJZjrfToWhoGeaGwW6IbknVwvZmx25Z6vvm4nHtdm9v3pRBTc+Whe3YyEKx1WUtqfJ63RwI36FeFPiey972CClrni5O+ejW87wuHlHVNT/ziQ/xj77+x/Q6EX
PDojdGOMKQYBrwHdbhO47VNiyS1p/fstTrdZoijb+/srVEbIcKEdGIQVXUa0Y+Zc3keAqezfHjM4a7A25eVx7xL+08z6pK/rIux7/wcf/uGYOuz1dePuAf/NETPM9mtSrZGIV8+pkRv/bmKbtdl9Wq5H/512ehygAAIABJREFUnQecvFwyCB1evNrnaJaTp+qa7XRcLi5SsGC1yhU0D7iuzfhiiawktm3j7uwpC1wLKktQLhYKYp4cqqI0PdG2rpr17seKqJfO1eTqBjqX3Wtldoa4ZpzlQBXDzQ1lf3yZtFaX2v71UsE0O2/Dnjc7bUPkMy58rt/K9uKBWj3UEsoc2x4gywyKlDpdwMYVbNtGnk3wN3eQUtLp+Pz+D4/4a5+5yvfeqCnLmhdfusF3vj5TDc75I4UgGL29IRoaToEpzJZQ5y9LdZuR9PW21X2zpXptDEdh8ypWd0gzOVG3zU41jB+3/Ic8IckU8bBYjRjt72Dbgtu7PY4Xf0FnPHHjGtV4RXN2rCYIWdM8eQqWRf3m29TvPKZZLhBxiPBduHaL5u7byloxjhk9t6MiTR0Ha9BHJjnF6Zw8KXF8F7sXsn1rSBjaeI5gOlUs/nvjhHElOS1KurZg4Djsei5XfWed8HbdVz3Ka0nOUiozmX3PRWDxWOvHQ6HMbWrUe0ckLDZdQVY3zCqpC7fylgfY0US9pFYFT6AahaVUDUfZNGtGva0ne2EpH/2BI9jQU7xvqVjbhZSclxVPilK/Hzn4GirfiDzyRjkF7l3t4e4MKU/n1Hfv0/2rL9P5zHMcPpmzHXq4mtDnWWjUQf19zkp1nh1brFFO9dyUcx/6ufccwaySa93+Qtbsex4j12ZSqfWIOXeAi6rmuKjY9VwOPJe+Y7Mf+zyzEbH94V31xhxFUJYqxfADfHx06zklraoL5vmS0A1oaEirnHfmT7k/f2dtkWthsRft8P2zV1XymRtxezRir7NFLiWbnRgpax4/OgELwtAn7oRsbPTXEPb5eMZ8mXD66Jw6KSnmGZ1+DEMfaz9WU7thsm8F6uuLjLWH/XbYMuHNbty13+0jr3f8xTLHdu1L1rX6d1io+5t9eNW0UKZh7kNrcmP2+r6tzq/jai/8BrnIIa1YLTN8z6M/7IJrr5n2OILdrSGe73JjOADgzcl9vnD90zy/cYXvvPWQO88c0OtEWJHDencGl1YNjnYE1OdqOAOybs10ui6MM2xP8xRWpQq6GfmwLFWRjxylCjCowaJQ7oF9Dzv2cDyH3laPWzf2cF2H3U4H2Ui+9fR7f1mX41/4+Pf+rQ+TFhV51SiOT+AShg6vvnrEa08XvH44Z55JikIyHAbsdFy+83BGP7D59PUuo60+N7Y71HVDTzsqTk4nOhnNotfzVVNXSeRqQbZMWJ2cUh3eb+Fpo5OP+poxrz3cOyNVXJfjVrIW9mCw1xLNjJ58eryGqddM+sWkNa1xA1XYTMZ7XbUNqGkUQs0BME1pmbcufUafbwpk07RyPkCmKQQx4dVba7Kg6ysXvMHWAMd1yPOKIHD42isn/PJPXefmbpfDwzk4Lk7ga7MfjTwYEp6B1x1PvV6gbjNEQdMQ+LGa6KFtruOh+rnpCc30TDcFGunIVzo1T2jCYYDT7eNs7bN76ypRHNDpeISe4Ftvnr3n9fO+hT75ve8xv3cG0ynV2YxyuiK9d4p3Y5f8cILY3lRMe8dBLlKar/0GxeNT5Nv3aZ4eUp4tSL7+HXBd6vMx3naPRtbUdUNdVtSrnKO7yov54KCD7VgslyUdW5HFLirJw6zgtKw4LkqdiKng9FVdM64UA/9OqMgOhuG+6aqiFQg18YPKgD8v6/VUPK3k2lpWoqD2SSXXMHeDen9RGnb1epiJXhVetbcfOOrxBo5DWtccFhUnZUVe1+tVwNO8ZCEl46riiufRtQW5RhrOz1PsyKM6nWL3Qh7/wQMALn7vNXzPVmzbQMkKhSYL1po/4FrGg1/t7fNarTMMP8
s61LIqlPBaYWe7Z6vzM4Gf7SciozqMMrSGS3xrB7ViIXUWoFoVYS2rqxy/coPx//o7pNt7xNMApV1H6TSpmirRWx9grjawNtoME0G7e21ryDnTIAP+va9eKTj3wmrX1BTKqsxWkJAV4sCnl6vc9BPGScYgSbnph2iS2LM7y1WCsY9/+5joYEj+yndIRh75228S7g4JQhG8qxSvgyFLBMXPX1JkjKJTj4rxvVN0+nGeU1KUk9fMlmXamkpbEwp+VZIIwhStUxYQHEWBep35y28IxX2vh9RZYHZtE0YjUFUxptre/pd6UP1VXYaq0/UH3B5tk5GRZCkrpQ5BEvJfPPkrWKpJzxcio67f5XLjHHmeczDvslRqszs7JM5inln6TMHJt8jJaVoNBuGQf/DutygbDhvri5Qdixcun8f3AkxDQ1FkoRNRFYyaTTQRGo+T7lSRxO7+Hgo2B0o6ki7WK0maoqoqpBmhJxLYRoMpmqbSHQhu+/b4AFtVaVp1HM1GAtp2nW/fvsE/vPF7fDi4zfvdI5pWjYYlpmtvHNykZdtcbrWYznymc59rr3/Iw4+cB1nit37rS0KApxX2u3qhdvcTcRAIUh55+pLopLMcvISkO0dRZJIkoeraTMZztvtD3j865uu3/ogwjfh4eJuj+Zw4EfAfRZaRVJEyB6Brqgj/cXQx1bjXFRYC1jxIRXEvXj+J4gFhCw6B6H4i8hzqpoutmiy4Tb5z++0TdX3LbPL20SEVw+bOeI9Fe5Hd2f7P7H78y155Dte2R9y400cu3AUriyXmYcLzz2wQxCndWYwsS5RNlTMLJfrTgO3ujNFgxs7RlJ2jKVfONDi7UceyNG5uD3EKjrvvJ6wulGgvNSjXy6ytVej1vPsEPAShTXMdiALBnM+yYsxaKMtlqQicscR42imLry2U4WkUEXk+ZrVMMJ1j2iZJlNBcapLnOW7FJUkzFEVmcDRAVSU+2hryO9/+kHc+7NLreVzcqKOqMowOeetWl8Xzp2m2CjiNptPfOYByCySJx371K/eJeVmRomc497GzeQ7NNdHtJ5H4eBKh6jrhbMbGxVUCL6TX85hOQ/7+n2yyvz9hbzBn0BVTINmyRRevauiGjqqryLKM6tgoparY4duCwKdoGpj2/bVHJqZ8WrUuPpYmhfUvIg5jGg2b6TRkNApYrNsc7nZ5cKVC93DEzz+zzq33dzBNlThOmQYxP37/6KfeP//fZLydPZTVJTI/EjtaTSPZ75H3++SDIcnd4g+fTsUpRVUhz0k2d4k395BWVsXvVyrCQ6/r5Ls7SKunCsFCjHZ6mZVnN7j+tXdI07xQZ4rM+SyHwzimoihFPGxGjlDgR4UAzSg6+PtrUIlenDAr4C9BljFPMyqKoOSlec5kdyS+xjZEEp8ksXN3iveN79H+8qPYp1sonUbhBY3B8yhXDJKxz+TmER9+6wOurFf47EsP8MWri/jFCmH00SG2IjNKMmZpzk0/ZpbmPOwa6LKw2PUHAY+XDNZNlQVNFQx6xNTh6KMuSZJhnWmjdypkMw/jVJssiDi4OyYuLHU54j3Wj8VkY5qKQ5BVRNLGeU5VVZlnGZNUjPoFwEfGKEJvDFnQAu/hgSsVA71ThbU1GI+Jf/wWzvNXyN95W3hCNR334XWQJNK7+6KbX/jpe6FPw9WwqtydHrHottBloUqtmiV6/pA3Dt/BTwK2xrvEWczd6QFtS9jHdEXleD5gEs4wFZM4SzAUgzRPSfOMrckdHm09QsOy8JOQF9bP80tPPczXf3DthJinKAqWqRPFCeHERy+b4qB7r9D/c108ajE2lwQwxjJ0snGI54fia+IMzw+p1EokSUKaZtw+7DKPA8qGAGpUDJe9/R7/+5uv8OLGJdpOlUW3RZxlJJmwDbbbVSZhyK3+gN/97uuc21jm13/xef76S88KPnyW8+FRVxT5NBfF/ciDMENbrYCp0jjdYjKeQ8eGsobRdMBWCYOIKE7Z2jkUqv5KiS
+eOY+h6iy6LcIkYmv3iLkvup17PvjIj0CWiIIYCWHXu6fCr5ZdSDLyKBXEPEC2VCRVIvVjjIolCo6h4FgG9YrLhXaTK60LGKrGD3ff44nlDf5051WiLEGVNZ5cXqPt1JnFPqNwSMUo/yxvyb/Utd5yuXswZX2hfDLBK1kaYZyyuTsmz2HzcEK5bPDHP9nlYttCVWQsXQTUbN3ucWapwv7Ao1U2cV0dSZI4HPn88tU2uq5w2J9TqZg88+Rp3n1rhzBMUFVZUAd1TYRpBSGKUyoEd8WuOwrEr5NYFCzNKD4eoRkaiSeCa1AUcUiYBzjVMsE8QJIlxv0xw76w2Q3nEc2mDbMhNz84oNm0uXSpw+pyGW8eEMQpuiojrz3AzQ8PmE/mfPD2FlalzBe//DDnHjknUutCj+07A9HNg/hex13R7btFxK5bE9OH6qL4vNoiqLrY16satz/YIfAEa391tYLjaBiGynASMBlMBA5YEtMo0pRwNBRQoDw/sXqmSQppjFtxBWM/ikASzwZZN5BVldjz0B1H0PTyHKteQzM0LEuj03JYXCxx42afZ587z29/52PyPKdd0rj8yGmqVZPJJODWrT7JvefKX3B9cqGXJOF7n05RWnWmP3xf5LQfjWEyIe7PQJZQljtk+4cEL/8Q6cGHAMiTFK1TJX/zGvl4TPT+piDr5bmw4n30vmBXbx6Q9wfceuUmtZpBZcGltVRC12UxlZMkxknGNE0LMI70LxQ7o+hky0UErYTYmVuyjKPIKEgnKW+6Ip34x+fzmHCnj/rwZfHiyzJ+lgsanGUJdkAUCQWkZYFtU1tw8Uc+5XMd1s7UqF1e4vvfeJ/vvnvIginU57NZzCgRboEoy2lpCqYs0dZUlnWVD/2QG3OfXpxSKqJ3QeTEL9VM1n7+MlGccfzjLfIsQ7YM4u6YZBqwsFQ6SZjLyAuevujSVYQwUEIiLch3fir8+7YsYxZdvVJ0+VVFxPfqsowuyZQNlWbDJJ350OtBtYr+lS8hnb0geArr62K/BOTTGXFvKkb7+5/ubkggYQUGt2y4vLp7kyiN2R4fk+YZo3CKJElossbm6Ig/2PwTnl58HFVWUWWFRbfF6wfXmcZTbg63OZz3yfOcMA159eB1LjZWyPOcvWmPb71xnWrVpd4oU29WUAuKl4iTzIkmwclOHrhPpdMVDNs42YNKqsTcD8HRsC1DjDk1MR1QVYUwEvv76dRjezzmSnudcTQlJ2c69VivVVktLaIUkJjx3KduVSgbDovtOsezOWcbdc6eW+HCqSV+91vf5/e/9ar4nhSJMCj28Pd0BQUXv1EtgavS3+1z+4MdGEWgKYQTH4IUSZHoNKt8+ZHLxHHCux/eYRb7JFlKkIT0gzHn1hbxg0ggtSWZPMnFyq/4uWVZFsl4AJqC54dopmANSJIA5eiahqHruDVHZAYAiixTq5RoNCvYmsbt0TZRmvDrF7/ERmWNh1rn6dh1/MQrHBai0zvyu3w83PoZ3In/epcfJYxGPooksVC1uLk7IkoyRlNBlXvv+h4Ai1Wb3TvH/OPvbfPi5RZ1V+SYX7jU4VrR8e10RVHN85xZEPP3/vwO1YqJqgry2ssvv4dpm1iWhq4rxa5YFC98wcFH0QrSW6FwT1Mky0E1dDTLFN2zrp/E15qOyLzHsEAC3dTFgbWIerUckycfESP6iqODN6LWEhOoKE4ZjAKyNENVZFaaDrV2jTzLWV4TE4iltRYv/9Fb3PzJDayNByDPCP3wvtc/FtoCdEugc0tN2HpL+OzvMfrDgo9vuTglh8eeOU/oh9zdOmQ8Drmz1cdxdEGgiwKCyfR+gl2eoTglFEUId/NMiBqzJEHSdEI/RDUN0HQURSFNU5EjYJsYriu0DaqGVqnilB2qNYf9/Ql3dkYoisQXH13GjxJKJZO1061C156ztTVkba3KZDCh1/3pSaKfXOiDgDwRe5fwo7uUnroASYJ9aZn0eIDiGoKc5/sne4b8jddhdR
XtynmkB6+IEW9nAeNv/y0x/u/3hZo8E+EVxnIdybGpVnQOjn2isQ9ZTm8akQOje7hEhCZoWlDyvDTHKuxtkzQtoBz37GYyZVU5sbjJRbHLc5G9DjD3EtS6I7r17W0IQ9Y6Ntr6IkwmKBWX3A+o/o//nVhDJAnT7hynUyL1BKRDaVbRCzVluaTxSMM9Ebd0dAVbkYrEOthYKXEUp1yxTWZpRk0VgrlTpoGMsMSd+cpDvPW716iUdVqPrpEFsXgtV9roZ1aYjYNi2pvjyiKOFwTnX5LgkbM1Hr3UwJAE7c7PMlxFuBYudEooEkyKSUiUi3VGqUi00zQJrWwK4V+SCBfF7Y+J//TPxComisjvbBHs9EjnEdavvkTy4SZStfqv8rz6mV9BEmKqOmme8s7xTR5bPIWrOZyqdhgGc2TkonBHmKrKNPT5wf6PWXWX2aiu0LKanK4uockav7D+AlmW4iUecZYQJOHJA3C11Ka9UOf2nQO8eUCapHQHAuARRnExalLEnjlKT2xhckGDC71QYF1VGU1V0TUV0zKYzoUQSdJkVFVBUxSSYj8/Gt2PV+16Q/r+iKWlJk8snaXnCxXxPPb5P176r5lGnnAc+AHrtSrzOCZNM55a3qDk2CBJVGslPvuFR4miWIzqrQLHa4r/nzm7DOOYSw+fEap8R4U4pbFYFweRKOVXnnuUv/eNP6PeKPP45TPsTae07BorpQ7r5SWOJ+J7jsO46PKKe9ixyNOcJx+/xMbaArZp4LgmURzjFNjfc6eXQVMIejOSJCWMEjRVwSnbwpmjibHpqcoCSZYyi+Z8OLzJtaP36PsCbbo7O+DV3U3mkc/nVp5kd3KEdc8e9Sm+/CjFcXTmYcLW4ZRL6zXmgTgQybKEaZvIssRuf86Fy8tomsyPN0e8tzPk8gMLWLpKu+2w3HD4lceWmM0i0kJ3Ua+I3bhtajTrNnbJpr9zwGwaoGkKwTxAlmViPziJg1W1ImM+igoBnCpIcRNhV5QUIZbLi5S3cU+8F2RVFWS8YqR/7/2TxCmOoWJqCrMgRl0+h23rdLvzAswa8T/9rafojXyiOKN/2KdcLxPHIuL16sWWeKEUDafsgG4VKY2yKPD3MLihR61dK9Twxn1oDWDW66KzjwKef+4s7755hzSOWVhpsXe3z/kLHc6tVdlYr0GeoTsO0Wh4f1oBQrsQhtQ7ddIkRVZV7JJNHMWYtgkSNBYaAIR+KCA9WSYcBCWHJE7w5z6yLGNZGpqmMB6HbPfmfPj+IbIsgD1//OY+cZxxuHmXZy+2CP2QWv2nJ4l+YqGP93qCux5GGFfPkQ7Ggn2uKORpTjLykB9/EsplMi/CWGlAuUz43deRzl4gf/89pEceEye/OzeFiMs0hdBrPD4JRAlv7dP+a09h6Qp63WHY91iomWS5SFWrqXJhmc2xZFE8Txcd9D3hnSnLuPdCPySYpdlJF+sUwrNZmmLIElfO1bAsVXABDIN06pMPxT+Y9MjjwkJoWQze3CH/8Ssc/+l10u4Q21aZH02RVAX72QcFzKbQEiRpThCkeF6CKUsYkoSX5vTilFGSUVoUI7e11RIXbIMUeHi1wlYQcqlm88LFJt6NHeZZhnt1jbe+8QHh7gCGQ+KdI6bffZe5lyAXpMAwz4okv/xkVA9gnW4VVNX7Qrwnz9Rpn21gy0qxsxfxtbokUdNUVixDrEpWG8VTpbCHhCHqUotkv0e6tUN0a5e4PyOZeCTf/S7ai58n2Tn8pFvor/w6nPUJkohRMOPRhYtEaYKj2bRtEa50d3rMqcoyi06LKE15cukylmrww/1rrJfW6Po9Fp0OaZ7SC/o8sXgVVVaJ0wRFFvjUIIn4vXff4LlTp2hUS9iOSRQltOoVwjAWSvLCO5/7iSicqozasIsHHaiGRtl81bM5AAAgAElEQVS1T/zkaZoR+KEY4ecZjiW8uP3RFMMxuXhuFVmW6E5nNK0ao2BK3x+TJCkXa+eZhHNWSh1e3b3JW9
23+J1rr3E87+OYBh8fHGOqKr908SGSLD1JxZvPfKIoxvfC+7jZHCHAC1MeWlwAGR44vYy+IvzRl65s0N/q0llr8fSzD3E8n5NmOc+tneZrL/+Iw9mMeeTzfu82v//xmwz6hW9dkU5G91kmMu0lRcI1DRqtKmmWYxg6EhJxnPDgxVNsrC9imwZazUbXVGI/xDR1XNtisVUnjGKWGlVszcSLA5ZKbbI8x1INJtGcMI3oegN6wwlpnnHt6F1e2niRzcIZ8Gm+NndGNJs2cZrx3KUWH++OOOrOWV8oo2kK0+GUumvQqohOfLVdYhYkggmvyRiawulOiYqpcjCJePxSB9+PxYpdlbF0hcks5ObtPhtn29j1KlmWEQQJlmsJ7rplQqmGaugis13TQVEwXBfVEKp6o17Hcix0Q0eWZSFMSzMs1yKORLANwGw8Q1EVGgsN0iQlTVMars6ND44wislWo2HR6bh0ahbD/oyXbw7Zut1j4kfUWjWRiCfB6qkWu925+P7SWOCmDZtRbySKu6IV/v8EJIlWuwTTHpWrT4sDgKrTPnuaoHuM02xw7tFLDGchYfeIpY1l9reP6CzWqLkGN272eOONbWTbLZgARdiSJiA3USgscoPjAaYjwpYURSFLBSSo1qrRaDoYloGqqSiqQhzGwm5o6lSbVTRdo9NxWOy4xHGKokj0pyGlik0YJhzvD5jNIiaTAMUw+fZP9vgv/4PnufPx3k+9fz6x0GtPPULqRUiVMrPvvoNSdohfvUY6mqI4OpIiC1W2riNbGv7tQ5LdI+LjCfie2C3WGuTTKfn+HvnePlKtRnQ0Ig9D8tmcPE6YH4zBNNm42EAu9h4LK2VSxOjekGUMSRQoQ5apqgrzLD2JjK0oghQ0ToV1LM3vx7WWFCHk87JU+O+zHOfyMuNxiLnWRNKFP33y1jZLz2yQf+8VYaGbTCifaRG8/KoYGcYp3V5A47kLfPDD7RPNgW0oAuCliBHtJBZ422maEeY5qgQXLYPMj1kyVNI0YzuImCQZC7/xRcqKwoWXrqBaOvOBR8PSyPyIh17YwH5gFenFl9BfeAbFNvC8hPNrpZMduy7dhweVFIX+IOQHf/ShcEEVYTYZOaWzbV75wRaTNOWei+uet1+WJSxLoVzSSaeBUNafPk0eJyTbB8Q7h0iqjPqFL6CfXyMMU8zPXCTuTcmvvY76CafIT8PVduroskLZcPj+3eskWcLXPv4e+7MuZd3C1Uy63gBDMaibDm8cfiCKgScSG6eRx+nyKSzF5NZoi1ujO5iKSd8fY6oGURrT9cakaYqh6Fy+dIrTjTqqqtBsVZFlEUYjF4x22dbQLINqvUSWZ5RdB8swsEydOE5EEE6akWYZhinGfKauEycpYRCd/HnPXThDkqSs1qrIktBlfO/OJl+6/AAfDW9hayY3hzs81F7hg/5tZFkmyhJ6gwlfuXyF127cpGM3GAYTkXkviXshz3OO+yOhGwhSIQSRJIy2y53xGLlp0fd9ou4cZgm/+exz4Gr85hefpVV22TzqoWsq49Dj33/pBZ5YWub55af4ubWnATg4GnDlgdPomkoWZ+iaiiLLZHmGrml0B2PefucmSZIymXqC+TALuLKxyh//+TW8ICTNUsEpqLrEcSK6H1PHdS2Gvo+fhJyvr6NICnuzY6GzUHWeWniMU5VlAM7V1gjSmFfufp+2/emeSgGoqsxsJtYN371xdCKQ2zmaUqkYWK7F+7f69CYB29sjjkY+WZ5jmiqaIuOaKj93Vuyrt/seu/05rZbNYBSQZjlJmhOGQo0/m0WYjoksy4yHMypV+2QcLch2QJqiGCalRvWkI600Kide+SzLBEgnTlA1QdgzbfPkY3EUk+c5rZZLHMU0miWSLKfTKXHt2g6PPn0WXVVYqFlMfaHWf3ezR+AFDCYhw4NjrlzusH3rgMW2K1YRmaDhqZpKpV0nmwyE0O4e914zoNpmNArAnwqVvzeGYMbnnlwDVecXvngRw1
Dp932USoNhdyTG84rEly83eeyBDtWGyH03bbGiyAq7q6ZrJyN70zaZDMRKIPAC8jzHn85pL1T4+MaOWCsgRIq6qZNnAlSkmzq2a538Wywvl9F1hY8+OCRJMup1i6efOcP6aoUojPjCi5cJgpi/+/XrnLu8+lPvn08e3Y9GqGULLAtrowWVCnF/Rn4vPxrId/cgCEi9CMXSSWcB9sVFsh+9Bo5D8vtfEz/Q+QeQTIP84AC9UyUZeUjtFigy9S9chcEA59e+hKyrVGom/iRguWEJC23R2ZcUhQVdY5ZmOLKA1ISZ2FXfy95wFQW34LpPi8JfVRS8VGBfdVnih1+/jixL+FvHwg6YZkwmMek0wN86pv/GHeY3RDb5YHvIzbtT0lmApsqE28esLLuQZWx+f5MgSoWgzVbZnPtYikxHF3jejqbwaw8t0LZ0XnvnkFNFV3bBNoVDwLLIgeHrt5FUmf2DOVvTgCxMmG52YXGR4H/+O+S3bzHrz3nwqw9iOjprVZMkhyi/L764sFRiYaXMOE1OxvJVVeHx507xp9/5CEuWCbOMKBOwnDwXE5I8LwLLspxk7EGWkb59XSTUyRKKYxDujwi//g3SvSN8P+XmP/keWrNEHsViBfMpvjRZJc0zxuGMtl3iYmODvemUOI0xVDEV2p4ccmt0h54/Qy7siqcqLf7w9ss0rArf3HqZjJwH6hdQZZVX999EV1TmkU/FcJmEIS89eIWN6gr/8We+gqmqdJpVZjOfxXYdTVXJ/ERAjiyTcslmNJyhqSppkXsQhDF+GCFLMrIs41gGqiLjF8Xd0DXwE6I4Rtc0/q/v/IA0z9nuD0myhEkYEoUxiixzZ7zH93Y+outNKek217sH9Loj5pFHkqRcP95jebVNkqV86+33GE3mkINu6Pz4tRs8cPEU9kJZCPJkiS/9yrMstet899V3uXBmlSRJcRbLYCmcqa5DkPLD7W2WSiXeuXGbSW9CzXR4e++Ac7V1/tF7X+ft4/eYTj3+s69+CUVRWGzXIUqJin18nucsdxq0G1WSNC3gQCaqovD881f5vT9+TWgdppFg3BcIVVkW3v0oFiu+NM1QZYWuNxRsgSxFlRX2pj2+ufkyd8Z7dLsj/ttvf40Fp4GtmZSNT7+9rlaziKKUw+4cw1C5utHA90WxdC0Nw9SZTgMODqaMBxOyLOfu7phy2eBo6BMlGb/7xj5+nLFcs5BliTBMKZV0/EgI3IZDn8XFEo9cbPNvf+kSkiRRa5TE4d4WmfTZfAoSWPUabtVlOhifdOne1COJE6JQWPJAFEPN0E584Zqukc6n5KFYf9743jWSKGFvp4ujK4zHAYEvmCr9oc+P3trnqDtHkiS2t/p4gxFxnKJaFoNxwLkHVtBVmTd/dIskiiAOsGyD8VFP4G4XNwrxncwjX3gcRVE43DmE5hrT4RSpuQymS5QIjO3r7xwgSdA7HpOOevjTOeVamcsbDf6X79zm1v5YHDwf3aBSc+msdURYT3FJsoTliolGHorRvKqplKolzj+0we0Pd8VOfzoUKYBZdtLZk0OWZmRZznQaoioSvh8zn8cF+z5jMPAZTUO2tkf4M58/e+UDbFvHdkw2CpDSX3R9cqEfDESXN5+jlB3yXg/7sfOo60tICx30S6eRyiWYzdBOLYkxf5Qg2ybd17fA91EunBVM7Ne+z/ztO2InvrSEdm6ddKcQcqUp4a09srffQrY0ZENlPIm42/Pws5wN20SWoG6ojJKEiqIUmFlRZHRJdO+6JAsXUAGRkYDHH1kgzDNcRaGqKEJ1LsnC31wTiXtK2aFWM4TGIEqxazbOY2dRHIMkzliqm6QTAaaYbA8onWrAdMr6I8u06qbosJOczz/QZpQkPLFR57xl8nDF4Zs3jqjXDT7wQmxbpXZRqNTPmQb54SFfePE8OztT8iRjFKe0NI1kMKP2yBqS62Kst/He3sT3U37wT9/Cm4YsrZRRJRHkkxaeeMdWUVwDVxZ0wGGScm
65xOa1PbJirF9VFYxibJ8BsiRhmgJHqSoS5nqT8MZtlJLN/L09sjAm8yLshzeQbYMsTmksuJiGirK6hKRrcO7cv/pT62d4bY52USSZvj+hpNvcHG7zzMo652qnWHSbPNJ5gKtt8TOslZuYqn4SdfwnNz5iFvlcaV5ARuKd7nu8e7yNKqtsVFYpGw77sy5V00RTNH60/z5780MMRcFWVSaTOYfdIXGS0FioYeg6jmUynswpVxw0VWHuByRJiqYqovBnGbIsClhSjCa/+sLjhFGM5GromnaizXAsk2bJoWE2cHUdTVeZRj5xltGwLJ5cusQwmDD0fWr1MreHIpaz53k8tbLG1niPy6dXqJYdJFWsoJ58+kFGoynPPHmZ9lqT0w+s8u1vvEZ7oY73UY9y2eFUvYYsy5Q7FfzE56WvPsvdnWOO5nPCIMKtuwyDOZ/dOE3TqrPo1nj94A6zmc9//3vfYDiYsLDYAF1BVmXSTECCHNfiXL1OluUYusZk5rG62GL7ziFpJgq70XDQdI0wiohicXjSdQ1NVZBlmQc6bT7q3wXg9YPrNOwqkzCgbrosuE28OKDVqtLuiDS2LM85Xf7pndCn5To+ntNuOSLUytH5cG9MuWxSLZmMZxGWpfHMoysYhsrGuQVkWaJSMVlqOnxwfRdFESp9gA/3J+wdTonjjE7VYjwNiNOMUsmg7Ojc3B2x3ZsjyRJJkuF7AVEQkWUZ7kKHJE6wHIv5eI5TETz3PMtRVCFE03RNeOYNXYB3kpQ8y1k91RZjdVnFKJfEqL/VgVTsr+NUuK1MS0SvpmlGrWax2HZptUTol9uqEwQJyaDL5s1jdF1hHiZsXFzBqZTALpPncO7qWYgDOqtt7LKLWm/x1g/fF1a+g9uojQ6Wa2G7wu42nIVsPP0Yw+4Y308YHQ9wllYgSXBcg/WmhWEo3Hh7h2AecP3aJr4XiTwAVRTqez+rJEmouvgZVV0lCiIUTeFwb1AU8gy11hKvUxQTBWJSk6ap2OvLEratcXdf/Btt3z6iXLVJ4gRFkRgOffE5JWFrdByNctlEVX56Of/EQp+HkRDj5blQXutCSRnfvivY6+9tCt45wGSCWrUxHzpDOp5TXqkK9XqrDXEsIAmtktjR+z54HuHuUEwIJlO0uoO0sUHmx2jNEoYhVOEbrsmuH2LKMmkqUuJKqkKai6haW5F58oUNwixjo2oJhC1iCuAoMnubQ5qahipJ2MULIUlw5nIT73gKbgkWFsQUIojIwphJd07aH+HfOiJOMoIgxf2Nv8Z4HOJULWRLJz86Rv/MA6x98RIdTWMwCPjj947RJBmzZLDQsTn/1Cq6LDOdRry4UCZNM7QzqxxFMZqmQJahPXiOh//uf4U3DlipmZzfKIMkicS67W3iwyFIsHR1CUWS0HQFreny+c+d4VzHFShycqyVOnffPz7x1lcUhfqFDj0vJigmH1kuin1dU8VKRJVp1E1Wz9TRdAXJddAXq6RTD71VQu9UUVsV0DTSmVDcz0c+9Y7D5E+uCc7C7dv/Go+vf/PXPI5RZZWKYRcIVpeVUofjgn//1tH7jEMhEOv5ApLTthuEacQD60vUzQqmYhAXudqyJGGpBqOCprY1OmZzOCRMIhRJomO3sDSNsmFQLjvM/YDTKx36vTGaqhAlCUkQYRk6YSQU5I5t8vhjF4mThLWlNkmSEicpUSxO8n/4vWvUKyURqWqbRAV975mr57m1c4ij2ZypLnGh1WQUBPQ9j1uDAbIk8V73gLgg9v3tR36ZOEpYcF2yLCVOY55YOsO/9bknuHBmlcl4zo+/f535PGC1XGZ5ucXzD18EwPdCOp9ZxfMCLjWXmB5NqJZdtsZ7PLV8mr/z6/8R/bnHmVNLXL50mkkY8u7xEf1gyN1Jn4phcHZtAT8UAsaSY/HlF5/gkStnUQul8tnVBV67fYc0E2lsEhKLy036wzFZLBK9ojimXnGpVQS50DYNHNuk2a5SKtnYmoataaR5iq
2ZWKrBRnWRttMo0gkzVlp1ri4u8K3b15lFHjvTT7dzBMTDXFdlkiRjNI9QiiI8GPsYhnJirRK7W+G9PrdSJU4yaq0KJVPjynIJQ5MZTgKaDZtKxUCVi+jVgyl7213STKjxO1WLet1B1xUR1xpGOCUHb+qdhK4kkRjxJ3GCYYno2aX1NkmcUGmK0bIkSWRphlN22N3uinF5nmHapsBKhzFWpYxlG5iaRLPp0Fmq0T2eMp2G7O8OGc9ChkOf0A+ZjWb8jefWIRF/tyTB3v6EjbUqaxttqu0686nHzTc/Qm+0MQwFy7V47Olz4M8Y98dQXSAZ9ugslJnv76JZJt2+h+fF/Ke/8RieF1FfaOJWXey6WL/tDQOGQ59qs0pzoXbiWmg0bBbWOiytt5FlGUVRWF6tE3qhSAKMEjRDo1pziMKIzBevHxJYjoVpmyhFuFW9WWbtlDjoNqoWcZySZTkXLi9RrZqUyhayLGOawuYXBRGmbfLBjT36/Tl+lPzU++cTC30Wp+jn12B9HYZD0fEeHSNriujiGy7IMrnng2kKupckkac52TwS9qsjIdaSFhfR19ownwu7mqJgP7SOrCpkfoRcq8BgALKEtrbAdBJhyjJHXsSKZeDIMvNUkN/6cYIpy3STBBWJPM1QJYmjSUhZFd2+IYsOP8+FSl2XJMJMjKpNTcG6vI5h6wLtenCAbOokQ48syej1Bft+0POZzxPcksb8n35TYHbjBEmRkCyT2TdfJToY8uBji3ixQN7m5Ay7c9yKweZPdnlmvUatarC44HD60RVx0AFKrkb4/hb0emR/9AfIssSZf+d5kTNt6einl/Df3ST1QuJpgGxpPP1bz1F74gySrqLYBneO52Q5rFQtJEXm9sAjyEQGwNVzNbIwJs5z1gydrMgLOLVaQitEigCNZ8+jdyrotg5xjFQuE2z10E8tkEcx8eGQZPeIZOQR7Y9EdONGC6NVIusNhB3vU3wlBT3ugcZZwiQmyzNuDncYBqJQG6rOPPaZhHPKuk2cxgwCUfAVSWJ3dkg36BFlEXWrwsXGEpPiYDCPfZ5beZDVcplBMONiY50PB7cZBQElw2A8nlF2be4e9FhcaKBpKp4fYroWveEE09AIo5ggjIgiAXMajqfIsnACGLqOY5tUyyIZzjQ0oijGsU0sQ+fn1h/EdS3mscc7x3dwdB1H09ifzZhNPa4dfsROd0AcJVSqLv/ne988CcwwVQNVVvmDj97hYDZjZblF4IcQZ4ync7ZGI0plh2s37/DwkxexLINarcTlU8skWQq2SrXqcnt4TJTG/LOb3yNNU37rhc+y0qhSNgweX1zln338E272+3x01OUzCyv857/2ZS6fW2O9UmHJdXn3vS2yPGNloUFJ17n+/hZ5mjOfB5zbWCZJUuZ+SL1RJggjNFX8vYamYegaeQ5PXz7LSr2KJEkESYKpGtwZH9O0qgyDKV4ScGe0x+Zol73plOPJjJVSE1NV6flj3uvd+au6Pf+lL1kWHflaRxxwhqMAy1KZzWImk5BWy8ELE0xTRZYlajURZTqchTiOzoe7I/bGEXsDj6sbTRxDPbEmjschl882aHREut2pxTKvXj9kMgmwbQ1/5mO7NlEYUaqWMCyD0A8xHEuMvyWJ2XjGfDRFUWQU9V6mvYSsyEiydFLwoyDCrFWZT8XEoNKocOHyMq5rCBS3HwtCYhgTxyn+XDxfvHlAHMW0V9p87dUdmPXpH/SpV0Re++3tEfW6RbnqiN14OCfPc0aDOUmUsLnZp7S6RhInlBcXqKytMx4HJ+K9vZ0+cRTzg4/7+POAzz+7IXQtQUS1avLG+8di3RGEnF6v8tnPX6ZSFYEzgReyc2ufJPCptWskSca4Jzz1qq5SrpeZTsRUxGk2TlT1paqLbuoYprBAep5wQlSrFlMvEmE213dwbJ2ya1CrWXSPRsiyxHgcUKq6NDtVmp0qtq3z3ke9n37/fNLNpdQr0OnA0ZFQoqcp0fEEZX0ZDI
M8SYl2e8QjD+/dLdRWBflv/Lto/+HfRnZ0sbv/8JbApe7uQa0mUtnSVIzw8xx1Y4V4OCc56JMd95BkmfCjHRRVwtQE6U3X7ke73iNxZoiEtnbdJAtjHj1XI0F414Piv8d+9QrjWcywCI+5p1JXZDh65T2crzwLSYL/8T5ZECHpCqNRyMbFBnmasXBlET9JMQ2FyfGcpY6NbGqkXkQymOI+vE7cnWKdW8KQZS44Fk8+tnzy/c6CBF2XxXQDmN8dkN3ZoawotFbLpPOQ3A+IDkbUHl7l8Pd/hKLKaA2XG7/9I6yzC/TvDOkPQja/v0my34V6neMbB6itCmahoHcdse5QJYm6qqIgibzqvRGzLGWSiJVAXdfodn10TWa5KsZj3vt3md8+Qqs75HEMlQrmegNcF2l1Be3yWdSNVeyLizhX12lcWSaLUpSKjexYJP3ZX3jvfFquhmXTthsczI+Jspi96THzKGKlLJT0EhIDf8zOZEjXmyBJEl/d+BIvrD4DwN60x+70iCiNuTncYcltU9JtvDigZpbRFR2tSOO6M95j4Asb0cAvLDKGgeuYaJqKKsuoikwYRdiWYL5LEmysL6KqCo8+fP7EQx7FMUmS8p985UV29o+FAjqMUVVFiPbynH/y7qv82tXHyfKUDw6OyPKc7bH4+y+tLOLHMY+srxDHCSXLZG86pVJ1SbIMWZKYRD5Pr66yMxjxmYVFDFPnwsMb/PyzD5NlOYoi480D2s0qrWqJWr1M1/MIkwjD1Dm11CLLcxHvO5txZXGB//vaT7A0jdVyi//t269wplZjNJzSPRrwe29c485oxEq5zI39Q05X2yfxvrZj0nYcAj9EUiR0Q8MtlNqkOUEQ0a5XaDeqHB4PsUyddqPC3A+43R9wc/cIyzbw4piy4VDWDUq6Q80sca62zqnqMk27ytX2Ko+tLjONPE5Vq1QNh9GnHOMMnDgjpn6MrglVehCI1YVtaziWxngeMZ9HaJrCbBax3nJRVZly2aDbnbN5NEWRJW4fTliq26iqjCxL1OsWlq6iaQp+lPDerT7DoUeaZuzuDE5EdIZlnHSfAOScjOtlWaa9ukCaZrSXGieFPfRDkijhxc+dI45i0iQlCqITVX4URuxsD3jmilhp+n6M50W0Fqp4Uw/TNul256yfapBnObajMx4HKKeuYDomB8cz9vfGzGbh/9PeewVplp73fb+T43e+nDr3dPdM9+SwYTZgd4HdBZEIEJAJMBSzSJasYtksVfFKrJJLtkvyjcqUgxzkMiyXZLFIUCRNWjQYAZLaXQKbF9jJ0z3Tub8cTj7HF29jZFcRoEkWCYjV/6uZqZrzpVPneZ/n+Qc6HZ9TS2W8ioczu8CFq0sAWK5YM0RhRKVRwXItsiyje9AH3aLaEsU3SzP2Dsasnm7wu398F1WVWT4zy6339zAM8Z0risLb7+xw83aHSsVia2tAe7aE5VjIuoFuiMOndKwAy/OcOIopeP+eiOgWXQzLYPv9u6iqSqHkMh1NiYKI6TSmUNAxdZXFGY9qu0oQJqiKzPKMR7lawHE0FhaK9A77ZFnO/FyRYtFg2PuL6ugNA0nTQNOQGzUolTCWW8IFL46JdvtoDY88SpA0BempD5D9zq+Sf/HXsF5+BhQF2THAsogOR+QPHoLnkW9vk/aGIglta0fY0R4NCe4dIWky6TjEtjXSVIybB77oTJu2TnisDQ+Ok9j2u74g8H3iaTxFIc4FOQ8gPfZwliUe5bPLSGz85Evs7U/F51AUOg8HyJUSesNjNI55eLeHbGjo8w3SHNyWh+OozL5wBvvaaRTHwL8jutxBz+fW//k2V/7hj6JrMta5RaI4QzZUzp6rk2U5+bF2v7DaRHZtrqyWIc1RHAOp6BEcDJEbNcIwxTs7g2xonP/JD4FhMPu3P057ucTyk4uoq4vE79+j4GqQ5zzxf/1zdFlid28iViDknL/SRJUlvKUqr97pUtc0DFni9Kkirqsdm2SkbPV8Vk
4VUYs2UZSR9KZItg1BIH6zPCe9c59s6yH53r4wR1pYQLl6EcU1SLpjpI2zf/4n1l8zHM2mbtVpOw083WW20OBUqf3IJW8ST2k4FUqmyeF0ygvz1/lnb/8rXtn9Kk/OrlE0bFRZYaEwxyCYcn+wQ8n0eDDcox+MUCWFSRQxU6jSDwIeDIdEaSq6+oL9qECnaUoYJxQLjjCJkYXnfblYoNMZoGsqP339JbyCTZyIkb50PHqXJJj4AVkuiDqyLPFDLz3Ne1+7h62ZKLJwP6vbRdYqFSZjn1t7h9SdAo+1TxMEEVXLouE4/NgTz3OxMUfFKnK/30eRFFRV4bfe/Tr/5cd/BEVVuNCYIU1TlopFLqwvIUsSe50+dc9lpVwmzVM2Ti9QMsXDq2x6pFnGvFfj4KDPuZpgtv/sd3+MK811/v7Ln2N+scVja0s8M3+adw8OcB2L2719fuXv/gMA+v0xQZKg6RrPPXMJXVPRdI2vfuV9ypUCkgTtdhXHMTF1jaPekM3tQ1r1MjXXQZJgNJzQdgsMwwmaojGKJmyPDni/cxdFkhkEI6IsYanYQpYkDiYTztVO4xnf+Tp6RZGpuAZRnFK0RR76ynyJUsmk7BqMp2LPa1ka43HIhZUqd/aHqLLM6bkSmqagyBIzJYvDY9a9aajc2Oo9SohLj4lgQZAw7A7RdeFxrxzzH+IoFn4PeS526lGM5YggHNM2hS48zVlfPZbpHnvZq7p4HUUVMjNZFl1+HMV89hMXCSYBw2lMmufsPjjCcXQkCdJEkABNU2N5xhNGPYDrGjz5gTM8/+wqt772UOzEVZnRKOT1r27xk5+5ICJwNeE0Was5LK+1MKA5AeIAACAASURBVEyDUX+EbetC9qYq1FdPMTga4FU8Gq0ivZ4Y4Yd+yOysh6rKbJyfIY4zrl9sU/BMSmWHRsPB92PiKCGKUj74/GlkRWY8mHC4P0CSJNyii6Io2I7J3a8/wKt4yIqYEhiWQWm2hT/xOdrrUKx6lCru8UEnRpKgOwppNFx8P6E3DHhwMMa2daEwG4RcfWIZw1BJ0oyltsfqmb9geh31OrnvCzOAIBBdeKEgRvWShHXxFJKuoxQshjtD8lf/SJitDIfk0ynYtihAW1sY64tIXgGpXEN+8mkkWTpmvOcig9o2UMs2imMiqTJFTyfKMkxZ5vIzi+iyTL1u0dA1Vo5DZHRJpuhoqCWbP/mFf8soTcWIWpKoqApZnDxy1MvJefKHn8LVFP7kn/4WkyQTev4woLHehEKBwdd3aTYsSkVdaOzLQo4iOwbO2Vn2vnSTvNdn/PUd7NUm4cMuvp/Qatrs/je/yJmPbZCPJ3gFnTzJGHampGmOYunIqkzSnzB89RZWzcVaa6LYIt99Z2cCsoymy2TTCBoNoVjY7wGglhy2X38opG+A97TYm+a/9L9SKxosLhSQNIWComCdmeXCWpnxVhdXVjh7qkgOvHGrR/tUma2xMGZRJAnVNdAqDqapoNZcogcHYJpEu33yvX0RqiNLSMen9uiVN5GcAqTC7Cg/OkRxzb/ww+uvA/NeEz8RARWTeIqr29iaSZhGxGmCoei0nBq2qnF395Dff/AKc4UqiqSQZCltt86D4SHvd2+zUp555Bc/V2gyiX0myZQ0z1GO9e8FXWfeKxOlKZZlEByP2j/42DlkWWJxqU25UuDK5TWSUHTtiiJib3/+l/6liG9Nc9I0w7EN0jzHtS1BVrNM/pNPfRe6rvE//8bvE4YRaZ6S5SlrM0083eGVTREyVfYcuv4EV3OEvjdN2ai2+bWbQjHw2/fe42prjs1Bl4P9LpVqkf/qj7/As2dXOfJHLFTKBGlKdzIlTFMKnsMkjtEVhdd2tnlqbZnLzSVKpsmM2+C9r99HlzV0XcVQdYIkwjNc0jwlzzOemJnlrfsPSbNMjPVn5iibJv/t61/Acx2WltsUDRFT/eGVDc6fO0W/N8LxbNZW5gC4fXebM8uzjKeCjCRJYNkGBV1HVR
UKnsPmoE9Bt9ke9Tny+5RNYW/rJyHboyEPhh1m3CamauDqOoNwSM36zpaIAkRRwtaRYJ8fDQOaZQvt2MkuB1oVm6Kjk+c5URjz2rv7dLs+nq3x9u0jmnWH3jDg3uGE9eUKhqbgGCqqKuNHCVGSYZrao3gBt+gShjGGZWCaKv7ER1VV1jbaZGnG0qkabtFlcVkUdfmY/zSdBHzxt94RVrBZThKEmLbJaBQSjKfIx6P9l144g+M5/OKvvy1G7NOIaZRx9uI8WZYTRYKYluc5e9tdVFk+to6VaTUcbt08RD2+lqIqTMcB3YMexUqB/+kL71Btljk6mlKpFphOI8bjEFmRacxU6XVGBNOA0A9ptT2uXl+l0RT3Sf+wj64r2K6NpsqoqkylYNLvTYnSjGbTZTjwmUxiRqOQpWXhvX/34QDHc2jPVZAVGd3UabTLLK40GfbGmI5JuSoIiGmacrRzJKYdcUISJew/OMA0Rby1YSjsHUxQVZkHmx0GgwBVlbl/r0uhoHP75gGaJvga5bJJp+Oz1/NxHf2b3j/futADUrkiyHOGIXztp1OkdkuM4BsN4YgUJZRXalAsCgLe2ppwTIpj0e17njgklErkgy75G19BXpgTCXhAOg7Q6wVkQxPmNUlGGKYstF1cU8FYqFMq6tx9MOIoTrj1YMS8oSNLcOaHnkOZaXL5xRUAWppGQVUYpRn6mSVato4hyZxfLPLVf/EKF3/4adRjKd431gdavUBy4x7lx5ao/70fo9BwUWwdJhOuXG0R7fYJd3rIMoQ7PQrnZkknIfZ6m1MfEh1+pxtw9Oo95E9+BqPpYS7V6PUCKmeaJAOfLMlQKy7mfAX/aIxUKgpXwSji7M//MBgG9XNtMX5IEmE77Oj4v/kl1IrD4g8+j+R5aHN1KJUgz7n7y69hmip3t0YkA5+So4Fl4T62wngck5JTOD/HlfMNrj+7yI0397m6VOKJv/MS63MF7I8/R57lHB4F5FGCfmoW0lS4FdaqxEdD0nHI9MYu2kIT47knyCcjpGoF1bPJHzwk7k++xd3z7cfBVDjEpXmKIinc6T1gGgd4egFNUZnzmmiyRpylXFqcxU9i5gpNlopz9AMxCksyQQRLsoR+OGJndEAvGDJbaPB+5x4zbpFRNGWmUCLNc/YnQwq60MXXyh6qLLNWadJqVLhxcws/iPjae/eo1Iq4tsnf++5PcLa2wHc9eZEkzajWihQ9h/Ek4GprAV3XMA2NSxdX+cJXXufnvue7jyV4JmmWMU184jTl9za/zrX5Wf7Rh38C85jB3wsGXFpbpB8E9IIhcZaxN+nyzPwqlmqwVmnw8avn0WSZOEp47fZ9Xlp8ktVyjdVyjc7RgDPVquAGaBqebrFerfKV+w9o2jU0RSHOYv7xZ3+Esunx+PlVVFl5tBKZxgG/u/VVAH78+gs0nAqnq1Vm3Bols8Br79zGNDTefe8ecRoz26xiqQbX5mcZDie4tsUHVpf54NOXePGZy7z+7h2WFlr87Kc+xtJck5965kVOlWr0uqNjp781ojSh5XrMFhpsDg5QZZXbvW0uNheY96rcHzzEUMRDcX/aYRh9Z/NMQPChJ0HMYWdKpzPFj1I6w4BWxRbZ5IaKoSk0Gi4Li2WqVYvVhRL7PR9FkY51JDANEyZBjKEqbO2O6HSmmJrC/YMR5bJJmmYsL5awbB3HMTAMhTTNqDQrwukuhyRJ2LzfIZgG3Hp/F6foIEkSZ862cVyTFz8sOmqv4mEXCwSTgGrVpjHfQpIl5pab/N6Xb/P4Y4skSUIURAzGIWkOw2HIaBTQbLr86GevYZrHk7cg5vT5JSRJYjASRfvG/R5zSw2qVZszG02WVlskiYjY7R70mZvxaDQczpyqMuxPWTpVY+/hIeVqAcsVI/v9vRGnWgVMU8WyVH7ks49RLVk0mgVkSWI0Crmz1adYsvj6XfEsuXi+zXNXZpAkiapnUqvabN7dR9M19nf7ItRGUymVTEFmNMQk9c
xKlUa7wuLqjDALapb57CcvUZup8cSzp4VvfX9MGKZcXKuRphnVusfKcpleT2jx+/0Ap2ChKCJBbzgM6femgo3f/+YrqG9Z6IPffYX8/j2+IbZWL5whHwyRlk6J8Jpul3w0RrZ1jBefFTn1I6GRZDQC2xaEvXYb6doTgmy3t4d08RJSrYbeKqJWCxgbS0hegaP3doV5zeEEXZcf7Z/CzQPKZYPVRY+nz1S5/NgM5aIgb6BpEMe88sVbvPBzn0aTJBxV4cmzNaRWiyjOaFRMai9f5tKLq2z/xutUKyarix7RXg+SGGlhQcRgLiyQb93jYHNAMvQhy7Ae38B95hyKbeD7KeYHrhEfjpBNnWi3j1Qq0t0d0ahbuHWH/N99icl2H2V5gZXrC6glm8yPuL85Qp2poy+2KH/4GmQZcXci+At3boGqotg62kxNHKxkmbgzxrp+nmi3L/7t7BWRJeD7JJ0R212f2ZdFbG4wChlPE/F9ZIJf8OS1GQAUW0f1LNbOVChvtNn813+EaSowGhE+7JIkGfHhSPzfKILVVZIHe2jXLhD3J9jn5o8ti2UkWZgJqa0K0tws1rmlv9wT7K8Yv/q1t+gEXRRJwdZMFrw2k3iKpZposkqYxPSCAbqi8v0bH8HVTMpGiW7Qox9OcDWb9eo8iqTQcmoMwzGObtNya1iqhaHozBYatJwaYRKxORhgaxpbh11sW8TU1uslRtGEeqPMU1fX+eAzl/jwU5coFRwGowk1q4SlGvzKH/wJ//WP/TiOZaLKMlcurnK5scF44nN2fYnvWjvLRy6d5V++9gqz7RrtdpX9aZcgCfjQ0nmqlsX1mfPcGd7jaDxhEIaEacSl5jwfW72IrQm3yY+f+iCH0x5+EhJnwuhnMJ6y3KjSblR4v3uHe/0OG9UVnt5YYa5QxZ+G7Bz1WCnP88zcRT5x/iK6ojONY3rBkNu9TSpmmdOVOhIyWS6uuzc54gNzl9ifjNEVlaeaT/HC/HVu97bZHnXp9IZ8/MlLhFFMlKVMpuJhVTAs5heafPTpyxR0G1WWWSgWuXp+hRcvrPN/vPIqWZYxjqe8sr2JqipMxj7jaIomK2xUl+n6A8qmw5E/wNVNwiSmoNuYx5a352oLOJpFzSp8O2/R/184OBBWsMWiwWQSYagyQZRyOAjwbJ3eJGISJMxUbVZnS8Rxyo17PUxdIQxT0iyn7JmEcYoiyzw4GuP7MVfPNbm8WBLCqrLNXN3l4c6Qzn4f29Y42hd74DzP0QyN8ThCVVVqdY+FlTbXHlsS+QTTgPW5EsP+hC9/6QaXri2jqAqGZYhDgir28fV2lesX21y4PMerr97DKTg05kSQ1CgQITq2rbMxX+L+oSDUlWseu50JpZLJ2lIZw1AJ/ZDnL7fFSkIRZL9KxaJ32BOMflni3maPnZ0h7bLF6fUmjYqNqqrsb3fQdI3ZWY+zG+K1Dw8nhGHKG3c6nG4VWJ4tCi6LLKFpMqapcmmtxmAQ8tY7O7y8WuZTzy6y15nQ6fqQQ3umSDAN0I4ldd/43dozJR67vsJwGuH7McWigWmb9A4H/KsvvM6oN6JRtNjbGxN/I+cBmExiPM9AliR0XUwujg5HmKaKYagM+1PCMGF2rsj29vARufJPw7cs9HqzKLr09DiIo9dDMg2YiCKO4yB5BWRNIb9zG0olojffFyPxOAbHQZ5pCR395l2o10V6XecINB2pXCYdTZEWl8DzaF5fRnniccIwRa8VUBSJctnAOLtM8fppdF3B+48+BFnO7NkGFz56hjv/2++BolAyVNJ33+fCS2tc/JmP0jn0yff2iLOcxe99itf+ly8THQypLVdY+MQVKs+fQ1+eAcMEXRf59KpKdvsujqOit0tiZTEYIK2fQ9JEnG7vl38fteqSxYkYW9s2c597DrfhYp1bhCShsNqAJEF/+hrhTg+9XWIYJ0KZMBiKgqqq6AsNsm6f6Ws3QJJQf/bnoVoVNsOSxGizS3J7S7yXKCJ/6z
X8X/oNpGdegDzn6rk6kzfu8th/+nEMW2P9bE3IHXUdfX2Z0cGYdBKiuCbK1Ys4FxfYf3ubxR94juLnXkZqNLEuLDO/XsPemBGe9poGh4dIqox07WmsjQWhFIiEtWMex6TTEFxXHPYqlb/kI+yvFmcaNbJjIqapGozjKWme8eWHX8FUTGpWGVe3qVhFXtt7g/P1Nf7Fe7+Nn4S4mim6/TShbHpMYp+y6eHpwmDFUR0KusPhtIurW6xWFnh+4RTPzz8GQKXooqgKlWqRtfICH1pepWbbXG3NsuCVOXVqhueeusg//r//DbqiUXBsPv/GH9JuV/kvPvuDdI4GxFmCY5v81JMv8j9+8XfZHAxYmWvy6ceu8OkrV1gtzWMoxiOC1CAc8cbeLWEbW29jKDqqrLBeFh38U7ML/Hdv/BtcTcTthmlE0XT5vktPkmQZ69UqnuGyWKzgJwHPL1xia3iEV3TodoZ4eoF3j25Tt8tEacRyUTwof+fuLZIs5sfP/SBFw+V0ZZk0T/n9e3e4P9xm3itjqgav7L/CL3zlV/no8geYcUs89eQ57vR6/MMf+ByqLPPU1fXjOGCT7zl3mXu9HnuTAeu1Bs/OXeal5XN8+cYd/s4LL/EPPvn9LHtzXG7NML/Q5Ox8Gz8JiY5zCPrhiM+sfYwgSdioLjOOJsy4TWRJQpUVCse/Y93+zr6HAYbdIZ6tUbR1HEfnoO9TPta9SxIULE08oscRf/zGNnduCpa4psoUCga2oTL2Y+ZrDoNpxMUl8Zn3ej6bRz5rMx5RnBIeGw9VmyXmWwVs16ZQMHBdQTRbPpaxaZqM5+l4toblmJy7vMhvv7JJseximAadjsho+MSLp4/H/jJxGPPhpxb5jd+5QZrmnFprcuXyLDMzBZxjjf9M08W2dXZ7U2RJwp9GtNsFJEmiXbE503aFPHqtwed/6XWKRYMwFDI034954fkzxGHMwrLo9GdnPcIk4+x8ibtbA4pVD0VVWFwskec5iiwRJhlxnJKmOcNhiCTBT19fYKlRoF53qJYt+v2A3Z5PluU0mh7//JWHfP7X3uMT12YYDgPOXZonDFO+71OXMU2VmcUG6wtlajWbCytVDg4mpGnOzEyBD19qc+HKAnbBZmlthseeWCbLc4pFg+ZsFes4ldC2NRrHE5ufenmZOIpptoqP3muapCRJhq4r1GrOI0XNn4ZvPbrPc9GFf8M+bWYG5ubE3v7omMqvaUhrq+LP4zH65XXwPPA8pEqVbHdfSOqiiOSdG4+kdfn9exAEqJ/6HuFHrKpIp9fI+z1aGw3UqktxtYHR9EQBarVwX7xK+tZ7KJ6FsTKD+oFnWX5uBVSVCz/zMZKhj7qxgrR4isUfeRGiCEsX137i5z4tonZliax73CFrmnjtnR3yNBPe+3lO7fmzYhIRBKSDCUzHaPUi2jOPEwYpyplVMj9CnakhlcvQ62G9dJ30qA/VKuF2TxTC8ZjwcIRs6TzzmYvioOQ64iCk65AkyLUK9nc/B4ZB9vl/Kr4fTYNiEcszIc+RGg2irX04PES2dPCnqJfP4jy+hvPUWaSzl1CLFke7Y1GULQtMk/rjS0iy4AYQBCTdsSDydTpweEg+HEC9jnV2Edk2iXa6Qh1hmoJN/+CueJ+2jX9rl7zbgelUyCuDQFzz3Zt/yUfYXy3O1uaRJYkojbBVi1PFhUc2qHuTQ6IsQpM1LtfPYWsmX917j+cW1mnaNQxV+Mw/GB0gIeEnITvjDrZm4mgWR8ERtmayXl1Bl3Vs1WK5NMcoGnFtaZ66bVNvlJnxCtzsbrJYnGG9OsMo8lFkmafmFvnh8y9weW0RR7X5u9/1Mqv1Kh9YXWbGafHzn/xe0ixhbqaOKiv853/rBwiShKplHe/mRSCIKquESYQsSZRNwSH4vktPPurWt0dCHth261xurNPzfZ5oXSZKY3RZI8syCrrDJ1YfI0gSbNXkwbBHUffI8ozD6ZSW6/LTL71IN+ixXllia7hHnI
nuw9VsPnvuCeIs4R/9yT/D1VwkJFZKi5RtiyAJWSnNszM64GDaJUpTVFljuTTH584+yXPzq1xtXMBQdL52+wGKJLLqF7wZnl9coWq57I2HTGLBtZAkiaNpn5vd+8iSwnyhybl6nZbr8ur2Fm/vP2R/0mUcRYyiEQteDV3W2RoOHl1DliTSPEWTVW71Hnwb7sw/H5qzFXYOJhiaQhgmaJpC0daJ4pTpMfs+SXOeWCkjyxJr6y0qFYsoyURRtjXCMGEwjTg4mPB7X90+jqqF8JjkZugKg2mEaWq4rsHD/TFz80UhW/MMlpbKjKYRrquz0PZ4+GDANEwI/JDFVoFKxWZjtcqFy3PYttCOVx2NH/3UeY6Opiiqgq7K/MSnLzAahUwmMdMwwbV1/FBEL0dJdjzNFXLC5VNV0jQjSTJef2+fYZBSdg1sSyOYBFxZqtDrTQmCWITAqDLr59pEUYrr6AyHIboiMw4SHEfDNFWeeGKJwUD4v2w+HGLpIitgrl3AtlVkSeKf/MFd3GOmfcHSKBQMRpOQLMtRVZnD7pRq3aNiq9RqNiVXZ3m+iK3LFIsmaZoTxRnGcVDPwjGx7+hoypduHGHqCo4rMkZ298fESUbJM7m80aBZttjtClLgyBdeGlVTZ+Nsi52HXeI4YzgMicKIctmiUbJIkvSb3jvwZxR6ud0URTY5HglPJoKZ/Y0iaZowHoti//RzSLOzoqiHIfn+Afn+HvLGGXFYGAxQyw70+7C1Rd7rwcIC2W/+OnmvIxj+hgF7QnevnFlFu34F8+VnReb5wQHYNtHBEH1ljnw6Zfz5X0FenBfTBVkmPhyJNDXdgJ0dcBzWnpyHIOCdf/JrmIs1zM99GklX8e8dkB8cguOSHvYwZsrk29vCmvfCJSTXFROKgyH5nduEW4fku7tUzjTA9zE/dB3abfJOR0Q06jrKbFPs1l0D6dpTEEUU/9YHkQ0VaXUFymXC2zuiEAcByDLp/vGBaTIRE49OB2n9AgDOZ15EtjQYj5EUmej2A2E6pOmEf/g60vIy2cER+Y13MS+fxivqpA92RLSsppFFCdH+AOviKdL3b5H0J1hLNfHbVavisBaG4jCxvoG+Ooe0vASjEeZHniff28F/6zakKdbZBXF40TSkuVmRbHh4iPZdL/4FH11/PXB0kRefZOkjxztD0anbZezjjv3BaJcojblcP8+11jkKuoOlmry9/1CQONvnOZiKeNqCbjGJfQ6nPUbRhLbT5N/efY1+OESWZCaxz+74EAl4cekyP3zxWb5v44NcbJym4w8EQS3LOFWaQ0LiF175TRa8IoosLEPfuveAa81zxFnM0bSPoRh86solxtGU/+xX/zXn6g1+4sJnKBkFHgy7bI8OKGguqqxyvr7I1nCH7dGIuUIbz3DQFJWDyYRDv8PN7n2G0ZgPLZ3mZv8upytLXG6sE6YRt3qbmKrJqVKbJEuJs4yW3WQYjfnoymUAFgozVM0Kh9MeG9VlJrGPqRr0giFl02MQjbg+s0Ev7DPvztG2W3z/2edIs4xuMODh6IjbvR1mCwVMxeC/f/WLlI0SN7q7PBg/pKDblMoFdieHnK+eJskSDFXnVveQjdosf/zwPXbGRzx/ZgVHtzhdWaIbCsJqzSoy49ZZq1R4cfkCk3jKhxausTfd52ZXHEo2am2KukfdqjLnzrA36fDu4T2em3vs23Jv/nkQxxmeZ9Abh8RxRq/nEyUpRUcnSjIansnN2x32hzGNhotra1RcA0WWUCSJhmfSqjoMpzGlkkmz6QiS3bFuvVHQeev9Q0ajiGrFQpIgjlM0TWG24lApmrywXqdSMPEcnf3uhGLJomBphNOQ3/vDOyzPekRJhmOoHB1NKRR0upOENzf7hGHC6Y02fpTyP/ziGxSLJj/20ikcU6PbF53yYsUiSTMKrsHXbh5x62Gf1RmPWtF6NJbujEJub/apeSaXnljl9Xtd1k/XObdcZTgMub3Vp12xsW2N4ShEliV6k5
DhNKJVc4jjjLJjkOdCXXB1o8GXv7qNW3KJ04xqyaI7jWlXbEZBwlOnayzWHE7NFen3AyoVS3izxBmLc55YoQQp7bLNyI/5k9sdZusumibWI8utAtvdCYoscfvWEeWyxb17PSZ+zMyMhyQJmWSS5QxGITc3exi6gmWozDVdTE1hpVVgaxBw88Yh/sSnVrOp1Wx0U6dVc4iSjIODCZdP17/p/fOtO3rThNlZURh8H3xf7N9BFPjJBEyT9K13yd9+nXx7G2lBmOtIK6dEAbEdQdzTNNHpBwG5HyA1GqJAGsJtj2P9L4aB8dzj4PtIikI+GIgOWJaFJexcRUwLHAfn4oJYLagqpCnup18Q1zjcE0VJ19GX20hXH+f047PCy30sDgPWs1eQih5EEZKuwuws6d4hWtUl390WRTcISCch2DbGxhK4LvHhiHxvH0YjpAvXkK48LkyCJhPx+Wwb+1Mvk7/5GvJnfgSpWhfKg9t34OgIyTz+Ho4168nAF8XWccSqIAzhcB9u3yYPQ+TZNlgWqmehL7ZEWmB7CbXqgqqJg87ODgQBhZUGSq1MvnVsPzw/g97whFIiStHqBfIoEd/3zg7pQQeiiGhzHxQFaXEJqT0HmkYeR0hzi5hzFfFbu2LCwXSKVK3C7CxSsUjeOfxLPsL+amGpBi27cbyPjgnTEFMxKZseXX9ALxhQNUu8fvAubx99jZ3xAUW9yNc7t3l+cYNJPBXBE4pG3S6zWGyzNz6iFwwp6A5BGlC1LKpWmY7fp24JFvL5+jISEsveItvjPRRJYRoHpFnGUrGFp3uUTY8PrqyyUT3FJJ6gyio/89xHMRSDJE/RFFH8/SRkrbTCRx6/QC/wGYQDFrwZrs+uM44mIt5WkijoYny3Uavzys6btJwaAP0gwNFsalYZSzW52z/g4WgfUzFoO20+eeojzBYaZHmGqRpkZPzQ2Q/zR7uv8aG5F5hx2mx1+2yNdhhEQ0bRhLJRoqA7zLpNJEnCVm1s1cTVHVRZYRyPuTW4TZYLwyIJiaZTZqnY4NOnP0Cap8xVSqR5ylqlRS8YkmQp87UyZdPjzkBYDRd0m9lCAUPRsY5VJ0XDwVB07vYfcjDposoKd/sHNO0a5+vLLHsLLHht/DRg3p1jtdxkHE9ZKy8yjseEaYilCrVI262gysq34c7886Fet6m4BkVHx/MMoiglTjKOhgFJmjGYxqiqzCvv7tGq2ARRiqUrbD4YsLs/5t7BmGkonNNOzxUZj0VnL8sy/WnEg26AZWkszXgcHk0ZjyOSJOPsfBlFljg/V+Srm326I8GhmK27rCyUCOOMhVMN5hdrWLrKzv6YnYMJc3MetbKNHyWMpzGKIrM2W6RV0Fk53RLhSeOYgqXx3IUWjZKFqkg4pkazKFLrXEdnr+ejyGJHHUUJlq6wfkqsHaJjj/522WKmqPN9zyww03TJc3BdnSTJaNXEemZ9psBM2WYy8hkHQsLm+zFNzyAMYs6crhPFKaamsNebYqgy+33x2jt9n+D40KMoEp5nMBwGXFoocacTMNN0UWSJimvg+zH7vSmapuAeryO2dkYMpzFeUYTVDDoDOp0p6vHnGnRH9IaCWb9594BpkGAbKitNl2rBoDeJ8EyFpeUKzdkab/3BGxwciHyKO5t9eqOQPEdMr78JvmWhT+/ch/198ZB3nEe6eqlUEh3hYEDWG6A0a6JLjWPyrU2ky4+R3bwNQH73jijUIIqYYSA1G9DvIxkGmR+K/b+iiIQhz0PyiuJad+6I/xNF4pBx965w0Ht4LDUr08hwhgAACHBJREFUl5HcgpgyHB0R/uFXyX2ffDT8f+2VZfJ33sRYqGPMlsknI2i1yDe3RDHPUuRGDcnzUBZmhV78eHpBnmOtNkVhdhwYDHCeXCdPUvw/egse3CV/901yPxBFvNlGWl0n9wVHIX/3NfIHm6K4yhLR4RC9XUEyTLH+iGOMZ66K6UEQQJqSHnbBnyCtrSE98byYpiQJWZQgrW
+IaMi3/h3q3/6Phevg3h7xbhc0DbVdfUROJM/J9w9QCiYMBqTTEKVSRJ+rkXe6UK2ibJwGXUdxDXJ/Sn6wT37ja1CpCMOH7iHSwjzoOlJ7Bg4PBXkwjoVTYrEobJG/g7E9OmBvekBBF77QpmKiKxo1q0rRLLA/6XA47bJSmqduVUiyhK3RNpcb57jZfUiaZ7zfuYujWUyTgDCJWSrNslZeoB+M8HSPgm6jyRo1q0yeZ8wWmjiaGK/fH22J2NTER5YkBuEIR7e43b+Po1usV5dpOy1czaVsevzyjdeOg3UywQtIpriaxf3RJhu1BRa8GnGW0LAbvN+5z+Pti0hImKpBxSxTsYrsT8bMey26/gBdFgXSUkwc3SLJEzZqswRJwq/ffpWH421eP3xT5LfHE9pOgzOl04Do4Pene/TDPrquUtAdRtGEgu5gKAZHfo+MnHPV0+iKTj8c4SfCZTAnp2k1uFy7TJhGVK0Sg3DE83NPk+QJX3r4Kn//+k/Q8Xs4msUgFA5rZ+uiCBiKWIUMQ6HqGIYTOr6PpRqUTA8/DijoNpfqZ/GTEFWWibKIcTzlrcOvsVCYo6QX2fcP2KiuYKkGjmoLSWTsM4mnpFlK06k8sjf+TsZ0mpCkGXEiPOklSeJwEGDpKqaucG9/RKViPdrLVwuGcA281GYyiTjoTnm4M6To6EzDhG5XdIYLTZeCqVFxhTa/NwkplUwcR6dQ0OlNQnrjkFduHfFwb8RCXUgRdztTjGPjHs8zqJYt5iumsNVVJXq9QFhsFwzWZotYlsbIj7nX8fEKBo2iRW+asFi12DyaslR3sDWZkq2zXLcoujphlDIYh+wcjakWTTRNwdDEemHkxzTrDnfv93jl7V3uH/n8739wn+29Mf1JRLNk0ao7SMDNzT7dScJe3xdJcYho5GLRJM1y4fKnKVxerqBrCmM/5vbukN4wQJMlyo7BaquAriu0SzZbW31OLZW535nyxzePuLIkDNEkCWF4Y+vMz3jMVhyx/e5MeOeNrePduoiytW0dz9aJ44z5pRrnFoWHv+VYTMOE+9sDvnKnw2zZxNQUbh8FLDYKhGHCmScvEARibTYc+ERRQrFoohybKv1p+JaFPumMYWlJPNQLBcGon05hQbDupY9+D8qVS4+6fVxXGOI8vC888h1HsLXjYyZhmgpJXaEAjkP2zrvI7SaSqorxexwJDf7DB0Jj74s9MIbQm2eDEVgW+bHXN5omil2WkRz0UAqWKPyOIwpTkiCd2YBOh3Q4QSqXxCFhNEIyDdKv33z0vrJ33kWqCkc8RuJ1KJeRdVX8fX9f+AJs7iLPtLA+8ixkOVK5jHzurCAZbt4T04ReD8Zj8js3xcFFV5FmZ9HnG2QTn3x/j3Tgi/XFdEqepo8+j2zp5IcHZO++B2+/irQsOAhKqw6KRv72G+JQdONtobXvHx+iXBcWFgDIj44gSYgOR0jrZ4g291Eck/SoT3I4EJ/R90VnPxyiFGxxMNneQbr6OBwdkScJ+XgsiJiTCfnRoThEjEZItkN8d5v47ffB8f4cj6u/fvTDEbNOi1E0xVAMwjSkHw6oGMKv+hOnXuZMZYVhNEaWZDzDRZYktkYPKRo2lmrgGS5JlpJmmdjj795ClVU8w+X1/Xc5XVkmzoSn/jAa0/H73B9so0gKw3CMJmsUNBdTNWg5NSaRjx8HGIpOkIb4iU+YhuyOD1mrVDAVE08vUDZK9MMh19tXCZKQaRxQsYpMkgn7031kSeJG9y5hGjIIR7x58DVczcFSRVEumQXCNGTe83jr8H1G0QRbtXlzf4tFr8lzC2dxNZeKWeZ8dQNTMRjFQlJ4s38XQzHohX0KmstSqUTDqlG3KkiSRCfoMoqmDKMRh/4RiiQTppEIl1JURtGInckeb3Xe4kJ1nZyclfI8YRryhRt/xIxb5/7oPpIkcbu3TZyltJwaVavE3vHhS0JiGI651jrLwbTPqVKVw+mAcTQlPg658ZMpk9hHk2V0WW
cYTniq/TjjeIyhGMRZjCar7I2PuDd8QMUssTM+ZNadYXvU495gB0e1v8136Z+NIIgpOTqjIEbTZKbTiKOjiciRn8ZcX6sy1xBF+P7ukDDJ+Mp7e+x0p5RK1rHjmoEqS4RxRq1mc/vGPlv7Y1xT4817XRbqLuNxRKczRddFtshwGv9/Crqlq7imxmzNIUkzpmGCogjLZj/KKNg6k0mMYSjYhoqpSmy0bLrdKQVL42gY8ODhgJKjczDwmUQZg2nE+ztD9kYxuiZzc2+CZ+t0u0KvHoYp2/tjZFni/Yd9bEMlTjI6PZ9m0+XyRgNNlWnWHb732QWyXMTujqYxqiJzaa1GmuecajikiZgCLC2VmGu43DkYM7dYRZUlbu+NjrNNREcdxxk9P+H23pDdns/5U1UyckxTo+qZ3N8doikyFUvF0RX2usJNMMtzVFni6w969CbhIyvcxfkivh9TbVUJgoTe5N9H1R4MfA4OxpiWTtUzGY8j1udK3D+aUrY1hkFKsyhIkWmaEwYxwTRgYbHMaBQJHob5zTt66Vsx9U5wghOc4AQnOMF/2PgzDXNOcIITnOAEJzjBf7g4KfQnOMEJTnCCE/wNxkmhP8EJTnCCE5zgbzBOCv0JTnCCE5zgBH+DcVLoT3CCE5zgBCf4G4yTQn+CE5zgBCc4wd9g/D8bhgzDKLnQzgAAAABJRU5ErkJggg==\n", - "text/plain": [ - "
" - ] - }, - "metadata": { - "needs_background": "light" - }, - "output_type": "display_data" - } - ], - "source": [ - "im = image2tensor(Image.open('images/grizzly.jpg'))\n", - "_,axs = subplots(1,3)\n", - "for bear,ax,color in zip(im,axs,('Reds','Greens','Blues')):\n", - " show_image(255-bear, ax=ax, cmap=color)" - ] - }, { "cell_type": "code", "execution_count": null, @@ -1558,7 +1252,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Training a model" + "### Training a Model" ] }, { @@ -1814,21 +1508,21 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "1. how could multi-label classification improve the usability of the bear classifier?\n", + "1. How could multi-label classification improve the usability of the bear classifier?\n", "1. How do we encode the dependent variable in a multi-label classification problem?\n", "1. How do you access the rows and columns of a DataFrame as if it was a matrix?\n", "1. How do you get a column by name from a DataFrame?\n", - "1. What is the difference between a dataset and DataLoader?\n", - "1. What does a Datasets object normally contain?\n", - "1. What does a DataLoaders object normally contain?\n", - "1. What does lambda do in Python?\n", - "1. What are the methods to customise how the independent and dependent variables are created with the data block API?\n", + "1. What is the difference between a `Dataset` and `DataLoader`?\n", + "1. What does a `Datasets` object normally contain?\n", + "1. What does a `DataLoaders` object normally contain?\n", + "1. What does `lambda` do in Python?\n", + "1. What are the methods to customize how the independent and dependent variables are created with the data block API?\n", "1. Why is softmax not an appropriate output activation function when using a one hot encoded target?\n", - "1. Why is nll_loss not an appropriate loss function when using a one hot encoded target?\n", + "1. 
Why is `nll_loss` not an appropriate loss function when using a one-hot-encoded target?\n", "1. What is the difference between `nn.BCELoss` and `nn.BCEWithLogitsLoss`?\n", "1. Why can't we use regular accuracy in a multi-label problem?\n", - "1. When is it okay to tune an hyper-parameter on the validation set?\n", - "1. How is `y_range` implemented in fastai? (See if you can implement it yourself and test it without peaking!)\n", + "1. When is it okay to tune a hyperparameter on the validation set?\n", + "1. How is `y_range` implemented in fastai? (See if you can implement it yourself and test it without peeking!)\n", "1. What is a regression problem? What loss function should you use for such a problem?\n", "1. What do you need to do to make sure the fastai library applies the same data augmentation to your inputs images and your target point coordinates?" ] @@ -1837,15 +1531,15 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research" + "### Further Research" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "1. Read a tutorial about pandas DataFrames and experiment with a few methods that look interesting to you. Have a look at the book website for recommended tutorials.\n", - "1. Retrain the bear classifier using multi-label classification. See if you can make it work effectively with images that don't contain any bears, including showing that information in the web application. Try an image with two different kinds of bears. Check whether the accuracy on the single label dataset is impacted using multi-label classification." + "1. Read a tutorial about Pandas DataFrames and experiment with a few methods that look interesting to you. See the book's website for recommended tutorials.\n", + "1. Retrain the bear classifier using multi-label classification. See if you can make it work effectively with images that don't contain any bears, including showing that information in the web application. 
Try an image with two different kinds of bears. Check whether the accuracy on the single-label dataset is impacted using multi-label classification." ] }, { diff --git a/clean/07_sizing_and_tta.ipynb b/clean/07_sizing_and_tta.ipynb index 8d23a6151..e1af7fe23 100644 --- a/clean/07_sizing_and_tta.ipynb +++ b/clean/07_sizing_and_tta.ipynb @@ -14,7 +14,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# Training a state-of-the-art model" + "# Training a State-of-the-Art Model" ] }, { @@ -270,7 +270,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Progressive resizing" + "## Progressive Resizing" ] }, { @@ -443,7 +443,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Test time augmentation" + "## Test Time Augmentation" ] }, { @@ -528,7 +528,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Sidebar: Papers and math" + "### Sidebar: Papers and Math" ] }, { @@ -576,14 +576,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Label smoothing" + "## Label Smoothing" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Sidebar: Label smoothing, the paper" + "### Sidebar: Label Smoothing, the Paper" ] }, { @@ -620,23 +620,23 @@ "1. Is using TTA at inference slower or faster than regular inference? Why?\n", "1. What is Mixup? How do you use it in fastai?\n", "1. Why does Mixup prevent the model from being too confident?\n", - "1. Why does a training with Mixup for 5 epochs end up worse than a training without Mixup?\n", + "1. Why does training with Mixup for five epochs end up worse than training without Mixup?\n", "1. What is the idea behind label smoothing?\n", "1. What problems in your data can label smoothing help with?\n", - "1. When using label smoothing with 5 categories, what is the target associated with the index 1?\n", - "1. What is the first step to take when you want to prototype quick experiments on a new dataset." + "1. 
When using label smoothing with five categories, what is the target associated with the index 1?\n", + "1. What is the first step to take when you want to prototype quick experiments on a new dataset?" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research\n", + "### Further Research\n", "\n", - "1. Use the fastai documentation to build a function that crops an image to a square in the four corners, then implement a TTA method that averages the predictions on a center crop and those four crops. Did it help? Is it better than the TTA method of fastai?\n", - "1. Find the Mixup paper on arxiv and read it. Pick one or two more recent articles introducing variants of Mixup and read them, then try to implement them on your problem.\n", - "1. Find the script training Imagenette using Mixup and use it as an example to build a script for a long training on your own project. Execute it and see if it helped.\n", - "1. Read the sidebar on the math of label smoothing, and look at the relevant section of the original paper, and see if you can follow it. Don't be afraid to ask for help!" + "1. Use the fastai documentation to build a function that crops an image to a square in each of the four corners, then implement a TTA method that averages the predictions on a center crop and those four crops. Did it help? Is it better than the TTA method of fastai?\n", + "1. Find the Mixup paper on arXiv and read it. Pick one or two more recent articles introducing variants of Mixup and read them, then try to implement them on your problem.\n", + "1. Find the script training Imagenette using Mixup and use it as an example to build a script for a long training on your own project. Execute it and see if it helps.\n", + "1. Read the sidebar \"Label Smoothing, the Paper\", look at the relevant section of the original paper and see if you can follow it. Don't be afraid to ask for help!" 
] }, { diff --git a/clean/08_collab.ipynb b/clean/08_collab.ipynb index 647dcfd1a..9a54e8821 100644 --- a/clean/08_collab.ipynb +++ b/clean/08_collab.ipynb @@ -14,14 +14,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# Collaborative filtering deep dive" + "# Collaborative Filtering Deep Dive" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## A first look at the data" + "## A First Look at the Data" ] }, { @@ -198,7 +198,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Learning the latent factors" + "## Learning the Latent Factors" ] }, { @@ -587,7 +587,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Collaborative filtering from scratch" + "## Collaborative Filtering from Scratch" ] }, { @@ -907,7 +907,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Weight decay" + "### Weight Decay" ] }, { @@ -1009,7 +1009,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Creating our own Embedding module" + "### Creating Our Own Embedding Module" ] }, { @@ -1207,7 +1207,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Interpreting embeddings and biases" + "## Interpreting Embeddings and Biases" ] }, { @@ -1433,7 +1433,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Embedding distance" + "### Embedding Distance" ] }, { @@ -1464,14 +1464,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Boot strapping a collaborative filtering model" + "## Bootstrapping a Collaborative Filtering Model" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Deep learning for collaborative filtering" + "## Deep Learning for Collaborative Filtering" ] }, { @@ -1670,7 +1670,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Sidebar: kwargs and delegates" + "### Sidebar: Kwargs and Delegates" ] }, { @@ -1735,7 +1735,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research\n", + "### Further Research\n", "\n", "1. 
Take a look at all the differences between the `Embedding` version of `DotProductBias` and the `create_params` version, and try to understand why each of those changes is required. If you're not sure, try reverting each change, to see what happens. (NB: even the type of brackets used in `forward` has changed!)\n", "1. Find three other areas where collaborative filtering is being used, and find out what the pros and cons of this approach are in those areas.\n", diff --git a/clean/09_tabular.ipynb b/clean/09_tabular.ipynb index b96b4e9e3..2a76d5d61 100644 --- a/clean/09_tabular.ipynb +++ b/clean/09_tabular.ipynb @@ -4,7 +4,7 @@ "cell_type": "code", "execution_count": null, "metadata": { - "hide_input": true + "hide_input": false }, "outputs": [ { @@ -34,28 +34,28 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# Tabular modelling deep dive" + "# Tabular Modeling Deep Dive" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Categorical embeddings" + "## Categorical Embeddings" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Beyond deep learning" + "## Beyond Deep Learning" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## The dataset" + "## The Dataset" ] }, { @@ -147,7 +147,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Look at the data" + "### Look at the Data" ] }, { @@ -253,14 +253,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Decision trees" + "## Decision Trees" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Handling dates" + "### Handling Dates" ] }, { @@ -945,7 +945,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Creating the decision tree" + "### Creating the Decision Tree" ] }, { @@ -6841,14 +6841,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Categorical variables" + "### Categorical Variables" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Random forests" + "## Random Forests" ] }, { @@ -6865,7 
+6865,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Creating a random forest" + "### Creating a Random Forest" ] }, { @@ -6965,7 +6965,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Out-of-bag error" + "### Out-of-Bag Error" ] }, { @@ -6992,14 +6992,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Model interpretation" + "## Model Interpretation" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Tree variance for prediction confidence" + "### Tree Variance for Prediction Confidence" ] }, { @@ -7064,7 +7064,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Feature importance" + "### Feature Importance" ] }, { @@ -7216,7 +7216,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Removing low-importance variables" + "### Removing Low-Importance Variables" ] }, { @@ -7325,7 +7325,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Removing redundant features" + "### Removing Redundant Features" ] }, { @@ -7490,7 +7490,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Partial dependence" + "### Partial Dependence" ] }, { @@ -7569,14 +7569,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Data leakage" + "### Data Leakage" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Tree interpreter" + "### Tree Interpreter" ] }, { @@ -7658,14 +7658,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Extrapolation and neural networks" + "## Extrapolation and Neural Networks" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### The extrapolation problem" + "### The Extrapolation Problem" ] }, { @@ -7779,7 +7779,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Finding out of domain data" + "### Finding out of Domain Data" ] }, { @@ -7978,7 +7978,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Using a neural network" + "### Using a Neural Network" ] }, { @@ -8297,7 
+8297,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Sidebar: fastai's Tabular classes" + "### Sidebar: fastai's Tabular Classes" ] }, { @@ -8355,14 +8355,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Combining embeddings with other methods" + "### Combining Embeddings with Other Methods" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Conclusion: our advice for tabular modeling" + "## Conclusion: Our Advice for Tabular Modeling" ] }, { @@ -8415,7 +8415,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research" + "### Further Research" ] }, { diff --git a/clean/10_nlp.ipynb b/clean/10_nlp.ipynb index 92b3717e6..1b8183ee5 100644 --- a/clean/10_nlp.ipynb +++ b/clean/10_nlp.ipynb @@ -15,14 +15,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# NLP deep dive: RNNs" + "# NLP Deep Dive: RNNs" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Text preprocessing" + "## Text Preprocessing" ] }, { @@ -36,7 +36,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Word tokenization with fastai" + "### Word Tokenization with fastai" ] }, { @@ -186,7 +186,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Subword tokenization" + "### Subword Tokenization" ] }, { @@ -412,7 +412,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Putting our texts into batches for a language model" + "### Putting Our Texts Into Batches for a Language Model" ] }, { @@ -849,14 +849,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Training a text classifier" + "## Training a Text Classifier" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Language model using DataBlock" + "### Language Model Using DataBlock" ] }, { @@ -919,7 +919,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Fine tuning the language model" + "### Fine Tuning the Language Model" ] }, { @@ -980,7 +980,7 @@ "cell_type": "markdown", 
"metadata": {}, "source": [ - "### Saving and loading models" + "### Saving and Loading Models" ] }, { @@ -1130,7 +1130,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Text generation" + "### Text Generation" ] }, { @@ -1189,7 +1189,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Creating the classifier DataLoaders" + "### Creating the Classifier DataLoaders" ] }, { @@ -1305,7 +1305,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Fine tuning the classifier" + "### Fine Tuning the Classifier" ] }, { @@ -1486,7 +1486,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Disinformation and language models" + "## Disinformation and Language Models" ] }, { @@ -1535,7 +1535,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research" + "### Further Research" ] }, { diff --git a/clean/11_midlevel_data.ipynb b/clean/11_midlevel_data.ipynb index 3c40c1f64..110ed0e89 100644 --- a/clean/11_midlevel_data.ipynb +++ b/clean/11_midlevel_data.ipynb @@ -15,14 +15,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# Data munging with fastai's mid-level API" + "# Data Munging With fastai's mid-Level API" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Going deeper into fastai's layered API" + "## Going Deeper into fastai's Layered API" ] }, { @@ -179,7 +179,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Writing your own Transform" + "### Writing Your Own Transform" ] }, { @@ -315,7 +315,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## TfmdLists and Datasets: Transformed collections" + "## TfmdLists and Datasets: Transformed Collections" ] }, { @@ -599,7 +599,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Applying the mid-tier data API: SiamesePair" + "## Applying the mid-Tier Data API: SiamesePair" ] }, { @@ -836,7 +836,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research" + "### Further 
Research" ] }, { @@ -851,7 +851,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Becoming a deep learning practitioner" + "## Becoming a Deep Learning Practitioner" ] }, { diff --git a/clean/12_nlp_dive.ipynb b/clean/12_nlp_dive.ipynb index 6e94357b1..c7020788f 100644 --- a/clean/12_nlp_dive.ipynb +++ b/clean/12_nlp_dive.ipynb @@ -14,14 +14,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# A language model from scratch" + "# A Language Model from Scratch" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## The data" + "## The Data" ] }, { @@ -176,7 +176,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Our first language model from scratch" + "## Our First Language Model from Scratch" ] }, { @@ -235,7 +235,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Our language model in PyTorch" + "### Our Language Model in PyTorch" ] }, { @@ -352,7 +352,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Our first recurrent neural network" + "### Our First Recurrent Neural Network" ] }, { @@ -450,7 +450,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Maintaining the state of an RNN" + "### Maintaining the State of an RNN" ] }, { @@ -634,7 +634,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Creating more signal" + "### Creating More Signal" ] }, { @@ -860,7 +860,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## The model" + "## The Model" ] }, { @@ -1030,7 +1030,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Exploding or disappearing activations" + "### Exploding or Disappearing Activations" ] }, { @@ -1044,7 +1044,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Building an LSTM from scratch" + "### Building an LSTM from Scratch" ] }, { @@ -1140,7 +1140,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Training a language model using LSTMs" + "### Training a Language Model Using LSTMs" ] }, { 
@@ -1339,14 +1339,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### AR and TAR regularization" + "### AR and TAR Regularization" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Training a weight-tied regularized LSTM" + "### Training a Weight-Tied Regularized LSTM" ] }, { @@ -1597,7 +1597,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research" + "### Further Research" ] }, { diff --git a/clean/13_convolutions.ipynb b/clean/13_convolutions.ipynb index e554616cb..54c321ca2 100644 --- a/clean/13_convolutions.ipynb +++ b/clean/13_convolutions.ipynb @@ -17,14 +17,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# Convolutional neural networks" + "# Convolutional Neural Networks" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## The magic of convolutions" + "## The Magic of Convolutions" ] }, { @@ -1253,7 +1253,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Mapping a convolution kernel" + "### Mapping a Convolution Kernel" ] }, { @@ -1479,21 +1479,21 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Strides and padding" + "### Strides and Padding" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Understanding the convolution equations" + "### Understanding the Convolution Equations" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Our first convolutional neural network" + "## Our First Convolutional Neural Network" ] }, { @@ -1737,7 +1737,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Understanding convolution arithmetic" + "### Understanding Convolution Arithmetic" ] }, { @@ -1808,21 +1808,21 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Receptive fields" + "### Receptive Fields" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### A note about Twitter" + "### A Note about Twitter" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Colour images" + "## 
Colour Images" ] }, { @@ -1896,7 +1896,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Improving training stability" + "## Improving Training Stability" ] }, { @@ -1982,7 +1982,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### A simple baseline" + "### A Simple Baseline" ] }, { @@ -2125,7 +2125,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Increase batch size" + "### Increase Batch Size" ] }, { @@ -2204,7 +2204,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### 1cycle training" + "### 1cycle Training" ] }, { @@ -2353,7 +2353,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Batch normalization" + "### Batch Normalization" ] }, { @@ -2634,7 +2634,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research" + "### Further Research" ] }, { diff --git a/clean/14_resnet.ipynb b/clean/14_resnet.ipynb index 0a4c7ca5c..889bfb6f7 100644 --- a/clean/14_resnet.ipynb +++ b/clean/14_resnet.ipynb @@ -23,7 +23,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Going back to Imagenette" + "## Going Back to Imagenette" ] }, { @@ -230,14 +230,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Building a modern CNN: ResNet" + "## Building a Modern CNN: ResNet" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Skip-connections" + "### Skip-Connections" ] }, { @@ -446,7 +446,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### A state-of-the-art ResNet" + "### A State-of-the-Art ResNet" ] }, { @@ -602,7 +602,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Bottleneck layers" + "### Bottleneck Layers" ] }, { @@ -856,7 +856,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research" + "### Further Research" ] }, { diff --git a/clean/15_arch_details.ipynb b/clean/15_arch_details.ipynb index b30d6c4df..a9e6ad2b9 100644 --- a/clean/15_arch_details.ipynb +++ b/clean/15_arch_details.ipynb @@ -14,14 
+14,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# Application architectures deep dive" + "# Application Architectures Deep Dive" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Computer vision" + "## Computer Vision" ] }, { @@ -97,7 +97,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### A Siamese network" + "### A Siamese Network" ] }, { @@ -353,7 +353,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Natural language processing" + "## Natural Language Processing" ] }, { @@ -367,7 +367,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Wrapping up architectures" + "## Wrapping up Architectures" ] }, { @@ -405,7 +405,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research" + "### Further Research" ] }, { diff --git a/clean/16_accel_sgd.ipynb b/clean/16_accel_sgd.ipynb index 5f91f6fa2..8634b0d90 100644 --- a/clean/16_accel_sgd.ipynb +++ b/clean/16_accel_sgd.ipynb @@ -16,14 +16,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# The training process" + "# The Training Process" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Let's start with SGD" + "## Let's Start with SGD" ] }, { @@ -229,7 +229,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## A generic optimizer" + "## A Generic Optimizer" ] }, { @@ -591,7 +591,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Decoupled weight_decay" + "## Decoupled Weight Decay" ] }, { @@ -605,7 +605,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Creating a callback" + "### Creating a Callback" ] }, { @@ -647,7 +647,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Callback ordering and exceptions" + "### Callback Ordering and Exceptions" ] }, { @@ -714,7 +714,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research" + "### Further Research" ] }, { diff --git a/clean/17_foundations.ipynb b/clean/17_foundations.ipynb 
index 44623bfb7..54c5f5f3c 100644 --- a/clean/17_foundations.ipynb +++ b/clean/17_foundations.ipynb @@ -16,28 +16,28 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# A neural net from the foundations" + "# A Neural Net from the Foundations" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## A neural net layer from scratch" + "## A Neural Net Layer from Scratch" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Modeling a neuron" + "### Modeling a Neuron" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Matrix multiplication from scratch" + "### Matrix Multiplication from Scratch" ] }, { @@ -112,6 +112,13 @@ "%timeit -n 20 t2=m1@m2" ] }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Elementwise Arithmetic" + ] + }, { "cell_type": "code", "execution_count": null, @@ -710,7 +717,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Einstein summation" + "### Einstein Summation" ] }, { @@ -743,14 +750,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## The forward and backward passes" + "## The Forward and Backward Passes" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Defining and initializing a layer" + "### Defining and Initializing a Layer" ] }, { @@ -1149,7 +1156,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Gradients and backward pass" + "### Gradients and Backward Pass" ] }, { @@ -1251,7 +1258,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Refactor the model" + "### Refactor the Model" ] }, { @@ -1573,7 +1580,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research" + "### Further Research" ] }, { diff --git a/clean/18_CAM.ipynb b/clean/18_CAM.ipynb index 6cbccccb9..f93e6761c 100644 --- a/clean/18_CAM.ipynb +++ b/clean/18_CAM.ipynb @@ -16,14 +16,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# CNN interpretation with CAM" + "# CNN Interpretation with CAM" ] 
}, { "cell_type": "markdown", "metadata": {}, "source": [ - "## CAM and hooks" + "## CAM and Hooks" ] }, { @@ -450,7 +450,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research" + "### Further Research" ] }, { diff --git a/clean/19_learner.ipynb b/clean/19_learner.ipynb index a61d6e8e7..b5ae9bd1f 100644 --- a/clean/19_learner.ipynb +++ b/clean/19_learner.ipynb @@ -14,7 +14,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# fastai Learner from scratch" + "# fastai Learner from Scratch" ] }, { @@ -1079,7 +1079,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Scheduling the learning rate" + "### Scheduling the Learning Rate" ] }, { @@ -1335,7 +1335,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Further research" + "### Further Research" ] }, { diff --git a/clean/20_conclusion.ipynb b/clean/20_conclusion.ipynb index 9465aae49..83783afd0 100644 --- a/clean/20_conclusion.ipynb +++ b/clean/20_conclusion.ipynb @@ -4,7 +4,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# Concluding thoughts" + "# Concluding Thoughts" ] }, { diff --git a/clean/app_blog.ipynb b/clean/app_blog.ipynb index 2cda23718..ab54fa2bc 100644 --- a/clean/app_blog.ipynb +++ b/clean/app_blog.ipynb @@ -15,7 +15,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# Creating a blog" + "# Creating a Blog" ] }, { @@ -29,35 +29,35 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### Creating the repository" + "### Creating the Repository" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Setting up your homepage" + "### Setting up Your Homepage" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Creating posts" + "### Creating Posts" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Synchronizing GitHub and your computer" + "### Synchronizing GitHub and Your Computer" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Jupyter for blogging" + 
"### Jupyter for Blogging" ] }, {