diff --git a/docs/src/tutorials/advanced_search.ipynb b/docs/src/tutorials/advanced_search.ipynb
index d0ad878..29dba13 100644
--- a/docs/src/tutorials/advanced_search.ipynb
+++ b/docs/src/tutorials/advanced_search.ipynb
@@ -11,7 +11,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 169,
+   "execution_count": null,
    "metadata": {},
    "outputs": [],
    "source": [
@@ -115,7 +115,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "There is also another search method called `search_best` which return both the solution and the possible error. The method returns the best program found so far. In this case, we can also see the error (`typemax(Int)`):"
+    "There is another search method called `search_best`, which returns both the best program found so far and the associated error. In this case, we can also see the error (`typemax(Int)`):"
    ]
   },
   {
@@ -135,11 +135,11 @@
    "source": [
     "## Search methods\n",
     "\n",
-    "We now show examples of using different search procedures, which are initialized by using different enumerators that are passed to the search function.\n",
+    "We now show examples of using different search procedures, which are initialized by passing different enumerators to the search function.\n",
     "\n",
     "### Breadth-First Search\n",
     "\n",
-    "The breadth-first search will first enumerate all possible programs at the same depth before considering a program with a depth of one more. A tree of the grammar is returned with programs ordered in increasing sizes. We can first `collect` the programs that have a `max-depth` of 2 and a `max_size` of infinite (integer maximum value), where the starting symbol is of type `Real`. This function uses a default heuristic 'left-most first', such that the left-most child in the tree is always explored first."
+    "The breadth-first search will first enumerate all possible programs at the same depth before considering programs with a depth of one more. A tree of the grammar is returned with programs ordered by increasing size. We can first `collect` the programs that have a `max_depth` of 2 and a `max_size` of infinite (the integer maximum value), where the starting symbol is of type `Real`. This function uses the default heuristic 'left-most first', such that the left-most child in the tree is always explored first."
    ]
   },
   {
@@ -159,7 +159,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "We can test that this function returns the correct functions and all functions. "
+    "We can test that this function returns all and only the correct functions. "
    ]
   },
   {
@@ -267,7 +267,7 @@
     "\n",
     "One of the stochastic search methods that is implemented is Metropolis-Hastings (MH), which samples from a distribution of programs based on the grammar. For more information on MH, see for example [this webpage](https://stephens999.github.io/fiveMinuteStats/MH_intro.html).\n",
     "\n",
-    "The below example uses a simple arithmetic example. You can try running this code block multiple times, which will give different programs, as the search is stochastic. "
+    "The example below uses simple arithmetic. You can try running this code block multiple times, which will give different programs, as the search is stochastic. "
    ]
   },
   {
@@ -276,8 +276,8 @@
    "metadata": {},
    "outputs": [],
    "source": [
-    "e = Meta.parse(\"x -> x * x + 4\")\n",
-    "problem, examples = create_problem(eval(e))\n",
+    "e = x -> x * x + 4\n",
+    "problem, examples = create_problem(e)\n",
     "enumerator = get_mh_enumerator(examples, mean_squared_error)\n",
     "program, cost = search_best(grammar, problem, :X, enumerator=enumerator, error_function=mse_error_function, max_depth=3)"
    ]
@@ -288,9 +288,9 @@
    "source": [
     "### Very Large Scale Neighbourhood Search \n",
     "\n",
-    "The second implemented stochastic search method is VLSN, which search for a local optimum in the neighbourhood. For more information, see [this article](https://backend.orbit.dtu.dk/ws/portalfiles/portal/5293785/Pisinger.pdf).\n",
+    "The second implemented stochastic search method is VLSN, which searches for a local optimum in the neighbourhood. For more information, see [this article](https://backend.orbit.dtu.dk/ws/portalfiles/portal/5293785/Pisinger.pdf).\n",
     "\n",
-    "Given the same grammar as before, we can try with some simple examples."
+    "Given the same grammar as before, we can try it with some simple examples."
    ]
   },
   {
@@ -299,9 +299,9 @@
    "metadata": {},
    "outputs": [],
    "source": [
-    "e = Meta.parse(\"x -> 10\")\n",
+    "e = x -> 10\n",
     "max_depth = 2\n",
-    "problem, examples = create_problem(eval(e))\n",
+    "problem, examples = create_problem(e)\n",
     "enumerator = get_vlsn_enumerator(examples, mean_squared_error, max_depth)\n",
     "program, cost = search_best(grammar, problem, :X, enumerator=enumerator, error_function=mse_error_function, max_depth=max_depth)\n"
    ]
@@ -312,9 +312,9 @@
    "metadata": {},
    "outputs": [],
    "source": [
-    "e = Meta.parse(\"x -> x\")\n",
+    "e = x -> x\n",
     "max_depth = 1\n",
-    "problem, examples = create_problem(eval(e))\n",
+    "problem, examples = create_problem(e)\n",
     "enumerator = get_vlsn_enumerator(examples, mean_squared_error, max_depth)\n",
     "program, cost = search_best(grammar, problem, :X, enumerator=enumerator, error_function=mse_error_function, max_depth=max_depth)"
    ]
@@ -325,7 +325,7 @@
    "source": [
     "### Simulated Annealing\n",
     "\n",
-    "The third stochastic search method is called simulated annealing, is another hill-climbing method to find local optima. For more information, see [this page](https://www.cs.cmu.edu/afs/cs.cmu.edu/project/learn-43/lib/photoz/.g/web/glossary/anneal.html).\n",
+    "The third stochastic search method is called simulated annealing. This is another hill-climbing method to find local optima. For more information, see [this page](https://www.cs.cmu.edu/afs/cs.cmu.edu/project/learn-43/lib/photoz/.g/web/glossary/anneal.html).\n",
     "\n",
     "We try the example from earlier, but now we can additionally define the `initial_temperature` of the algorithm, which is 1 by default. Change the value below to see the effect."
    ]
@@ -336,9 +336,9 @@
    "metadata": {},
    "outputs": [],
    "source": [
-    "e = Meta.parse(\"x -> x * x + 4\")\n",
+    "e = x -> x * x + 4\n",
    "initial_temperature = 1\n",
-    "problem, examples = create_problem(eval(e))\n",
+    "problem, examples = create_problem(e)\n",
     "enumerator = get_sa_enumerator(examples, mean_squared_error, initial_temperature)\n",
     "program, cost = search_best(grammar, problem, :X, enumerator=enumerator, error_function=mse_error_function, max_depth=3) "
    ]
@@ -349,9 +349,9 @@
    "metadata": {},
    "outputs": [],
    "source": [
-    "e = Meta.parse(\"x -> x * x + 4\")\n",
+    "e = x -> x * x + 4\n",
     "initial_temperature = 2\n",
-    "problem, examples = create_problem(eval(e))\n",
+    "problem, examples = create_problem(e)\n",
     "enumerator = get_sa_enumerator(examples, mean_squared_error, initial_temperature)\n",
     "program, cost = @time search_best(grammar, problem, :X, enumerator=enumerator, error_function=mse_error_function, max_depth=3)"
    ]
@@ -373,8 +373,8 @@
    "metadata": {},
    "outputs": [],
    "source": [
-    "e = Meta.parse(\"x -> 3 * x * x + (x + 2)\")\n",
-    "problem, examples = create_problem(eval(e))\n",
+    "e = x -> 3 * x * x + (x + 2)\n",
+    "problem, examples = create_problem(e)\n",
     "enumerator = get_genetic_enumerator(examples, \n",
     "    initial_population_size = 10,\n",
     "    mutation_probability = 0.8,\n",
@@ -559,7 +559,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": null,
+   "execution_count": 29,
    "metadata": {},
    "outputs": [],
    "source": [
@@ -600,7 +600,7 @@
  ],
  "metadata": {
   "kernelspec": {
-   "display_name": "Julia 1.9.4",
+   "display_name": "Julia 1.9.0",
    "language": "julia",
    "name": "julia-1.9"
   },
@@ -608,7 +608,7 @@
    "file_extension": ".jl",
    "mimetype": "application/julia",
    "name": "julia",
-   "version": "1.9.4"
+   "version": "1.9.0"
   }
  },
 "nbformat": 4,
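Note on the recurring change above: every edited code cell swaps the `Meta.parse`/`eval` round-trip for a plain anonymous function. The following is a minimal standalone sketch in base Julia (independent of the notebook's Herb setup) showing that the two forms produce the same callable, so `create_problem` receives an equivalent argument either way.

```julia
# Old pattern from the removed lines: build an Expr from a string, then eval it.
e_old = eval(Meta.parse("x -> x * x + 4"))

# New pattern from the added lines: define the anonymous function directly.
e_new = x -> x * x + 4

# Both are ordinary callable functions and agree on every input.
@assert e_old(3) == e_new(3) == 13
println(e_old(5), " == ", e_new(5))   # prints: 29 == 29
```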
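The edited cells also rely on the notebook's `create_problem` helper, which is defined in an earlier cell that this diff does not touch. Purely as a hypothetical sketch of the shape such a helper could take, assuming the `IOExample`/`Problem` types from the Herb ecosystem's data package; none of this is taken from the diff itself:

```julia
# Hypothetical sketch only; the notebook's real definition is outside this diff.
using HerbData  # assumption: the Herb.jl package providing IOExample and Problem

function create_problem(f, n = 20)
    # Evaluate the supplied function on a small input range to build IO examples.
    examples = [IOExample(Dict(:x => x), f(x)) for x in 1:n]
    return Problem(examples), examples
end

# With the new pattern, the function is passed directly instead of an eval'd Expr:
problem, examples = create_problem(x -> x * x + 4)
```

If the helper indeed looks roughly like this, it only needs something callable, which is presumably why the cells can drop the string/`Expr` detour.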