
Commit

Render bookdown
jhudsl-robot committed Nov 7, 2023
1 parent b9c2b7f commit 12b483f
Showing 5 changed files with 12 additions and 9 deletions.
4 changes: 3 additions & 1 deletion docs/01-intro.md
@@ -66,7 +66,9 @@ We must always be aware of the potential for harm and deliberately take steps to

Humans have been interacting with AI chatbots for years. In fact, Alan Turing is credited with coming up with the concept for chatbots as early as 1950. Chatbots are software-based systems that interact with humans typically by text or speech inputs, rather than code. They mimic some human activity [@wikipedia_chatbot_2023; @abdulla2022chatbots] based on these language inputs. They process the inputs using natural language processing, commonly abbreviated as NLP. NLP is a kind of AI that uses human text or speech and parses the language to determine structures and patterns to extract meaning. NLP uses large amounts of language data (such as books, websites, etc.) to train AI systems to identify these structures and patterns. For example, the AI model might identify whether a sentence is a question or a statement by examining various features in a prompt, such as the inclusion of a question mark or the use of words often used in questions [@wikipedia_natural_2023; @cahn2017chatbot].

The methods used for chatbots have evolved over time. Now chatbots often utilize AI methods like [deep learning](https://en.wikipedia.org/wiki/Deep_learning) (which involve multiple layers of abstractions of the input data [@wikipedia_deep_learning_2023]) to extract meaning from the language data [@wikipedia_natural_2023]. As these methods use large quantities of text, they are therefore often called large language models [@wikipedia_large_language_2023].
The methods used for chatbots have evolved over time. Now chatbots often utilize AI methods like [deep learning](https://en.wikipedia.org/wiki/Deep_learning) (which involve multiple layers of abstractions of the input data [@wikipedia_deep_learning_2023]) to extract meaning from the language data [@wikipedia_natural_2023]. As these methods use large quantities of text, they are therefore often called large language models, or LLMs [@wikipedia_large_language_2023].

Although it might _seem_ like LLMs are talking to you when you interact with them, it's important to remember they aren't actually thinking. Instead, LLMs are simply putting together tokens, or parts of words, based on a huge distance matrix created using an LLM's training data set. Essentially, an LLM's program figures out how frequently (and in what contexts) different words show up together in the training data. For example, the word "example" is often paired with the word "for" in the text for this course. An LLM trained on this course would then be more likely to create the phrase "for example" than the phrase "for apples", as the training data includes multiple instances of the first phrase but only one instance of the second. (To be precise, the LLM would predict the tokens "ex", "am", and "ple", but we see it as the word "example".) If you're interested in learning more, check out this excellent [visual article](https://ig.ft.com/generative-ai/) by the Financial Times (we are not affiliated with them).
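The frequency idea above can be sketched in a few lines of R. This is a toy illustration only: real LLMs predict sub-word tokens with neural networks trained on enormous corpora, not a simple table of word-pair counts, and the tiny corpus here is invented for the example.

```r
# Toy sketch of frequency-based next-word preference (NOT a real LLM):
# count how often adjacent word pairs occur in a tiny made-up corpus.
corpus <- c("for example we count words",
            "for example we count pairs",
            "for apples we pay money")

words <- unlist(strsplit(corpus, " "))
bigrams <- paste(head(words, -1), tail(words, -1))
counts <- table(bigrams)

# "for example" occurs more often than "for apples" in this corpus,
# so a frequency-based model would be more likely to produce it
counts[c("for example", "for apples")]
```

A model choosing the next word by these counts would favor "example" over "apples" after "for", which is the intuition behind the paragraph above, stripped of everything that makes actual LLMs work at scale.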

Although chatbots have been around for a while, the popularity of OpenAI's ChatGPT and DALL-E programs has sparked a recent surge of interest. These chatbots are particularly powerful in part because large amounts of computing power were used to train their NLP models on very large datasets [@caldarini2022literature; @cahn2017chatbot]. Large language model AIs can be divided into two categories: those that can be reached using an internet browser, and those that can be reached using an integrated development environment (IDE).

6 changes: 3 additions & 3 deletions docs/04-refactoring.md
@@ -553,7 +553,7 @@ proc.time() - start_time

```
## user system elapsed
## 8.933 0.003 8.935
## 11.963 0.004 11.966
```

:::{.query}
@@ -581,7 +581,7 @@ proc.time() - start_time

```
## user system elapsed
## 0.775 0.564 0.625
## 0.644 0.304 0.645
```

The `outer()` function performs the same calculation as the nested loop in the original code, but more efficiently. It returns a matrix of all possible combinations of x and y values, with each element of the matrix being the product of the corresponding x and y values. The `rowSums()` function is then used to sum the elements of each row of the matrix, which is equivalent to summing the products of x and y for each index `i` in the original loop. This method avoids the need for the nested loop, resulting in a faster and more efficient computation.
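The comparison described above can be reconstructed as a short sketch. The original looped code is collapsed in this diff, so the loop body here is an assumption based on the explanation (a nested loop summing the products `x[i] * y[j]`); the variable sizes are invented for illustration.

```r
# Reconstruction of the optimization described above (assumed loop shape).
set.seed(1)
x <- runif(200)
y <- runif(200)

# Nested-loop version: accumulate x[i] * y[j] over all pairs
total_loop <- 0
for (i in seq_along(x)) {
  for (j in seq_along(y)) {
    total_loop <- total_loop + x[i] * y[j]
  }
}

# Vectorized version: outer() builds the matrix of all pairwise products,
# rowSums() sums each row, and sum() adds the row totals together
total_vec <- sum(rowSums(outer(x, y)))

all.equal(total_loop, total_vec)
```

`all.equal()` (rather than `==`) is the idiomatic check here, since the two summation orders can differ by floating-point rounding.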
@@ -609,7 +609,7 @@ proc.time() - start_time

```
## user system elapsed
## 0.361 0.299 0.263
## 0.303 0.264 0.361
```

One optimized way to perform the same calculation is by using the `%*%` operator to perform matrix multiplication. This can be done by converting x and y to matrices and transposing one of them so that their dimensions align for matrix multiplication. This code should be much faster than the original implementation because it takes advantage of highly optimized matrix multiplication algorithms in R.
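A sketch of the matrix-multiplication approach described above follows. As before, the prompt's code is collapsed in this diff, so this reconstruction assumes the same pairwise-product sum; the sizes are invented for illustration.

```r
# Reconstruction of the %*% approach described above (assumed computation).
set.seed(1)
x <- runif(200)
y <- runif(200)

# A column vector times a row vector gives the matrix of all pairwise
# products x[i] * y[j]; summing its entries matches the nested loop
total_mm <- sum(matrix(x, ncol = 1) %*% matrix(y, nrow = 1))

# For this particular calculation there is also a closed form:
# sum over i, j of x[i] * y[j] equals sum(x) * sum(y)
all.equal(total_mm, sum(x) * sum(y))
```

The closed form in the last comment is worth noting: when a "faster version" of looped code exists, sometimes the fastest refactor is recognizing the algebraic shortcut rather than vectorizing the loop as written.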
3 changes: 2 additions & 1 deletion docs/introduction.html
@@ -434,7 +434,8 @@ <h2><span class="header-section-number">1.3</span> AI Code of Ethics</h2>
<div id="the-ai-chatbots" class="section level2" number="1.4">
<h2><span class="header-section-number">1.4</span> The AI Chatbots</h2>
<p>Humans have been interacting with AI chatbots for years. In fact, Alan Turing is credited with coming up with the concept for chatbots as early as 1950. Chatbots are software-based systems that interact with humans typically by text or speech inputs, rather than code. They mimic some human activity <span class="citation">(<a href="references.html#ref-wikipedia_chatbot_2023" role="doc-biblioref"><span>“Chatbot”</span> 2023</a>; <a href="references.html#ref-abdulla2022chatbots" role="doc-biblioref">Abdulla et al. 2022</a>)</span> based on these language inputs. They process the inputs using natural language processing, commonly abbreviated as NLP. NLP is a kind of AI that uses human text or speech and parses the language to determine structures and patterns to extract meaning. NLP uses large amounts of language data (such as books, websites, etc.) to train AI systems to identify these structures and patterns. For example, the AI model might identify whether a sentence is a question or a statement by examining various features in a prompt, such as the inclusion of a question mark or the use of words often used in questions <span class="citation">(<a href="references.html#ref-wikipedia_natural_2023" role="doc-biblioref"><span>“Natural Language Processing”</span> 2023</a>; <a href="references.html#ref-cahn2017chatbot" role="doc-biblioref">Cahn 2017</a>)</span>.</p>
<p>The methods used for chatbots have evolved over time. Now chatbots often utilize AI methods like <a href="https://en.wikipedia.org/wiki/Deep_learning">deep learning</a> (which involve multiple layers of abstractions of the input data <span class="citation">(<a href="references.html#ref-wikipedia_deep_learning_2023" role="doc-biblioref"><span>“Deep Learning”</span> 2023</a>)</span>) to extract meaning from the language data <span class="citation">(<a href="references.html#ref-wikipedia_natural_2023" role="doc-biblioref"><span>“Natural Language Processing”</span> 2023</a>)</span>. As these methods use large quantities of text, they are therefore often called large language models <span class="citation">(<a href="references.html#ref-wikipedia_large_language_2023" role="doc-biblioref"><span>“Large Language Model”</span> 2023</a>)</span>.</p>
<p>The methods used for chatbots have evolved over time. Now chatbots often utilize AI methods like <a href="https://en.wikipedia.org/wiki/Deep_learning">deep learning</a> (which involve multiple layers of abstractions of the input data <span class="citation">(<a href="references.html#ref-wikipedia_deep_learning_2023" role="doc-biblioref"><span>“Deep Learning”</span> 2023</a>)</span>) to extract meaning from the language data <span class="citation">(<a href="references.html#ref-wikipedia_natural_2023" role="doc-biblioref"><span>“Natural Language Processing”</span> 2023</a>)</span>. As these methods use large quantities of text, they are therefore often called large language models, or LLMs <span class="citation">(<a href="references.html#ref-wikipedia_large_language_2023" role="doc-biblioref"><span>“Large Language Model”</span> 2023</a>)</span>.</p>
<p>Although it might <em>seem</em> like LLMs are talking to you when you interact with them, it’s important to remember they aren’t actually thinking. Instead, LLMs are simply putting together tokens, or parts of words, based on a huge distance matrix created using an LLM’s training data set. Essentially, an LLM’s program figures out how frequently (and in what contexts) different words show up together in the training data. For example, the word “example” is often paired with the word “for” in the text for this course. An LLM trained on this course would then be more likely to create the phrase “for example” than the phrase “for apples”, as the training data includes multiple instances of the first phrase but only one instance of the second. (To be precise, the LLM would predict the tokens “ex”, “am”, and “ple”, but we see it as the word “example”.) If you’re interested in learning more, check out this excellent <a href="https://ig.ft.com/generative-ai/">visual article</a> by the Financial Times (we are not affiliated with them).</p>
<p>Although chatbots have been around for a while, the popularity of OpenAI’s ChatGPT and DALL-E programs has sparked a recent surge of interest. These chatbots are particularly powerful in part because large amounts of computing power were used to train their NLP models on very large datasets <span class="citation">(<a href="references.html#ref-caldarini2022literature" role="doc-biblioref">Caldarini, Jaf, and McGarry 2022</a>; <a href="references.html#ref-cahn2017chatbot" role="doc-biblioref">Cahn 2017</a>)</span>. Large language model AIs can be divided into two categories: those that can be reached using an internet browser, and those that can be reached using an integrated development environment (IDE).</p>
<div class="warning">
<p>The information presented in this course is meant for use with open source code and software. It is unclear what happens to the information fed to AI chatbots as prompts, or how secure the data are. We know data are saved and may be used to further train the AI tools, but the specifics of how data are saved, as well as how sensitive or personally identifiable information is protected, are unknown.</p>
6 changes: 3 additions & 3 deletions docs/refactoring-code.html
@@ -770,7 +770,7 @@ <h2><span class="header-section-number">5.10</span> Code optimization</h2>
<span id="cb51-16"><a href="refactoring-code.html#cb51-16" aria-hidden="true" tabindex="-1"></a><span class="co"># End timer</span></span>
<span id="cb51-17"><a href="refactoring-code.html#cb51-17" aria-hidden="true" tabindex="-1"></a><span class="fu">proc.time</span>() <span class="sc">-</span> start_time</span></code></pre></div>
<pre><code>## user system elapsed
## 8.933 0.003 8.935</code></pre>
## 11.963 0.004 11.966</code></pre>
<div class="query">
<p>What is a faster, more optimized way of running the following R code?</p>
</div>
@@ -789,7 +789,7 @@ <h2><span class="header-section-number">5.10</span> Code optimization</h2>
<span id="cb53-12"><a href="refactoring-code.html#cb53-12" aria-hidden="true" tabindex="-1"></a><span class="co"># End timer</span></span>
<span id="cb53-13"><a href="refactoring-code.html#cb53-13" aria-hidden="true" tabindex="-1"></a><span class="fu">proc.time</span>() <span class="sc">-</span> start_time</span></code></pre></div>
<pre><code>## user system elapsed
## 0.775 0.564 0.625</code></pre>
## 0.644 0.304 0.645</code></pre>
<p>The <code>outer()</code> function performs the same calculation as the nested loop in the original code, but more efficiently. It returns a matrix of all possible combinations of x and y values, with each element of the matrix being the product of the corresponding x and y values. The <code>rowSums()</code> function is then used to sum the elements of each row of the matrix, which is equivalent to summing the products of x and y for each index <code>i</code> in the original loop. This method avoids the need for the nested loop, resulting in a faster and more efficient computation.</p>
</div>
<p>So much faster! We can look at another option by regenerating the response:</p>
@@ -808,7 +808,7 @@ <h2><span class="header-section-number">5.10</span> Code optimization</h2>
<span id="cb55-12"><a href="refactoring-code.html#cb55-12" aria-hidden="true" tabindex="-1"></a><span class="co"># End timer</span></span>
<span id="cb55-13"><a href="refactoring-code.html#cb55-13" aria-hidden="true" tabindex="-1"></a><span class="fu">proc.time</span>() <span class="sc">-</span> start_time</span></code></pre></div>
<pre><code>## user system elapsed
## 0.361 0.299 0.263</code></pre>
## 0.303 0.264 0.361</code></pre>
<p>One optimized way to perform the same calculation is by using the <code>%*%</code> operator to perform matrix multiplication. This can be done by converting x and y to matrices and transposing one of them so that their dimensions align for matrix multiplication. This code should be much faster than the original implementation because it takes advantage of highly optimized matrix multiplication algorithms in R.</p>
</div>
<p>While this second suggestion is faster, you will need to consider what aspects of the codebase are most important in each instance. For example, this code runs more quickly, but <a href="https://stat.ethz.ch/R-manual/R-patched/library/base/html/matmult.html">the <code>%*%</code> operator</a> might be unfamiliar to some R programmers. In cases where efficiency is less important, or the data are not large, you might consider maximizing readability.</p>
2 changes: 1 addition & 1 deletion docs/search_index.json

Large diffs are not rendered by default.
