From 4759b7772f2195df88c17787dd2e43b8879edcb5 Mon Sep 17 00:00:00 2001 From: Amandeep Singh Date: Thu, 4 Oct 2018 22:39:30 +0530 Subject: [PATCH 1/4] Translated Joseph's Problem --- src/others/joseph_problem.md | 114 +++++++++++++++++++++++++++++++++++ 1 file changed, 114 insertions(+) create mode 100644 src/others/joseph_problem.md diff --git a/src/others/joseph_problem.md b/src/others/joseph_problem.md new file mode 100644 index 000000000..b6ca64ec0 --- /dev/null +++ b/src/others/joseph_problem.md @@ -0,0 +1,114 @@ + + +# Joseph's Problem + +Problem situation - Given the natural numbers $n$ and $k$. All natural numbers from $1$ to $n$ are written in a circle. First count the $k^{th}$ number starting from the first one and delete it. Then $k$ numbers are counted starting from the next one and the $k^{th}$ is removed again, and so on. The process stops when one number remains. It is required to find the last number. + +This task was set by **Joseph** (Flavius Josephus) in the 1st century (though in a somewhat narrower formulation: for $k = 2$). + +This problem can be solved by modeling. Simplest modeling will work $O(n^{2})$. Using [segment tree](https://cp-algorithms.com/data_structures/segment_tree.html), we can perform modeling in $O(n \log n)$. + +## Modeling a $O(n)$ solution + +We will try to find a pattern expressing the answer for the problem $J_{n, k}$ through the solution of the previous problems. 
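The "simplest modeling" mentioned above is not shown in the patch; here is a minimal sketch of such an $O(n^2)$ simulation (the helper name `joseph_bruteforce` is illustrative, not part of the patch), which can be used to generate tables of answers:

```cpp
#include <vector>

// O(n^2) simulation: keep the remaining numbers in a vector and
// erase every k-th one until a single number is left (1-indexed result).
int joseph_bruteforce(int n, int k) {
    std::vector<int> circle(n);
    for (int i = 0; i < n; i++)
        circle[i] = i + 1;
    int pos = 0;                                    // where counting starts
    while (circle.size() > 1) {
        int m = (int)circle.size();
        pos = (pos + k - 1) % m;                    // index of the k-th number
        circle.erase(circle.begin() + pos);         // O(n) erase -> O(n^2) total
    }
    return circle[0];
}
```

After erasing, the element that followed the victim slides into index `pos`, so counting naturally continues from the right place.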
+ +Using modeling, we construct a table of values, for example, the following: + +$$\begin{matrix} n\setminus k & 1 & 2 & 3 & 4 & 5 & 6 & 7 & 8 & 9 & 10\cr +1 & 1 & 1 & 1 & 1 & 1 & 1 & 1 & 1 & 1 & 1\cr +2 & 2 & 1 & 2 & 1 & 2 & 1 & 2 & 1 & 2 & 1\cr +3 & 3 & 3 & 2 & 2 & 1 & 1 & 3 & 3 & 2 & 2\cr +4 & 4 & 1 & 1 & 2 & 2 & 3 & 2 & 3 & 3 & 4\cr +5 & 5 & 3 & 4 & 1 & 2 & 4 & 4 & 1 & 2 & 4\cr +6 & 6 & 5 & 1 & 5 & 1 & 4 & 5 & 3 & 5 & 2\cr +7 & 7 & 7 & 4 & 2 & 6 & 3 & 5 & 4 & 7 & 5\cr +8 & 8 & 1 & 7 & 6 & 3 & 1 & 4 & 4 & 8 & 7\cr +9 & 9 & 3 & 1 & 1 & 8 & 7 & 2 & 3 & 8 & 8\cr +10 & 10 & 5 & 4 & 5 & 3 & 3 & 9 & 1 & 7 & 8\cr +\end{matrix}$$ + +And here we can clearly see the following **pattern**: + +$$J_ {n, k} = (J _ {(n-1), k} + k - 1) \ \% \ n + 1 $$ +$$J_ {1, k} = 1 $$ + +Here, 1-indexing somewhat spoils the elegance of the formula; but if you number the positions from scratch, you get a very clear formula: + +$$J_ {n, k} = (J _ {(n-1), k} + k) \ \% \ n = \sum_{i=1}^n k \ \% \ i$$ + +So, we found a solution to the problem of Joseph, working for $O(n)$ operations. + +Simple **recursive implementation** (in 1-indexing) + +``` + int joseph (int n, int k) { + return n>1 ? (joseph (n-1, k) + k - 1) % n + 1 : 1; + } +``` + +**Non-recursive form** : +``` + int joseph (int n, int k) { + int res = 0; + for (int i=1; i<=n; ++i) + res = (res + k) % i; + return res + 1; + } +``` +## Modeling a $O(k \log n)$ solution + +For relatively small $k$ we can come up with a more optimal solution than the above recursive solution in $O(n)$. If $k$ is small, then it is even intuitively clear that the algorithm does a lot of unnecessary actions, serious changes occur only when taking modulo $n$, and up to this point the algorithm simply adds the number $k$ to the answer several times. Accordingly, we can get rid of these unnecessary steps. 
+ +A small complication arising from this is that after removing these numbers we will have a task with a smaller $n$, but the starting position is not in the first number, and somewhere in other place. Therefore, by invoking recursively ourselves from a problem with a new $n$, we then have to carefully transfer the result into our numbering system from its own. + +Also, we need to analyze the case when $n$ becomes less than $k$ - In this case, the above optimization will degenerate into an infinite loop. + +**Implementation** (for convenience in 0-indexing): +``` + int joseph (int n, int k) { + if (n == 1) return 0; + if (k == 1) return n-1; + if (k > n) return (joseph (n-1, k) + k) % n; + int cnt = n / k; + int res = joseph (n - cnt, k); + res -= n % k; + if (res < 0) res += n; + else res += res / (k - 1); + return res; + } +``` +Let us estimate the **asymptotics of** this algorithm. Immediately note that the case $n < k$ is analyzed by the old solution, which will work in this case for $O(k)$. Now consider the algorithm itself. In fact, on each iteration of it, instead of $n$ numbers, we get about $n \left( 1 - \frac{1}{k} \right)$ numbers, so the total number of $x$ iterations of the algorithm can be found roughly from the following equation: + +$$ n \left(1 - \frac{1}{k} \right) ^ x = 1, $$ + +on taking logarithm, we obtain: + +$$\ln n + x \ln \left(1 - \frac{1}{k} \right) = 0,$$ +$$x = - \frac{\ln n}{\ln \left(1 - \frac{1}{k} \right)},$$ + +using the decomposition of the logarithm into Taylor series, we obtain an approximate estimate: + +$$x \approx k \ln n$$ + +Thus, the asymptotics of the algorithm is actually $O (k \log n)$. + +## Analytical solution for $k = 2$ + +In this particular case (in which this task was set by Josephus Flavius) the problem is solved much easier. 
In the case of even $n$, all even numbers will be crossed out in the first pass, after which the same problem remains for $\frac{n}{2}$ numbers; the answer for $n$ is then obtained from the answer for $\frac{n}{2}$ by multiplying by two and subtracting one (because of the shift of positions): $$ J_{2n, 2} = 2 J_{n, 2} - 1 $$ Similarly, in the case of an odd $n$, all even numbers will be crossed out, then the first number, and the problem for $\frac{n-1}{2}$ numbers remains; taking into account the shift of positions, we obtain the second formula: $$J_{2n+1,2} = 2 J_{n, 2} + 1 $$ We can use this recurrence directly in our implementation. The pattern can also be stated in another form: $J_{n, 2}$ runs through the sequence of all odd numbers, "restarting" from one whenever $n$ turns out to be a power of two. This can be written as a single formula: $$J_{n, 2} = 1 + 2 \left(n-2^{\lfloor \log_2 n \rfloor} \right)$$ ## Analytical solution for $k > 2$ Despite the simple form of the problem and the large number of articles on this and related problems, a simple analytical solution of the Josephus problem has not yet been found. For small $k$ some formulas have been derived, but apparently they are all difficult to apply in practice (see, for example, Halbeisen, Hungerbühler "The Josephus Problem" and Odlyzko, Wilf "Functional iteration and the Josephus problem").
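The single formula for $k = 2$ translates directly into code; a sketch (the helper name `josephus_k2` is illustrative, not from the patch), finding the largest power of two not exceeding $n$ with a simple loop:

```cpp
// Closed form for k = 2, 1-indexed:
// J(n) = 2 * (n - 2^floor(log2 n)) + 1
int josephus_k2(int n) {
    int p = 1;
    while (2 * p <= n)   // p = largest power of two <= n
        p *= 2;
    return 2 * (n - p) + 1;
}
```

Equivalently, the answer is a cyclic left shift of the binary representation of $n$ by one bit, since subtracting the leading power of two drops the top bit and $2x + 1$ appends a one at the bottom.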
+ From 2e9c5fddd16682ac001e6bc35d1aeeec1661a322 Mon Sep 17 00:00:00 2001 From: Jakob Kogler Date: Sat, 6 Oct 2018 20:14:01 +0200 Subject: [PATCH 2/4] Explanation about general case --- src/index.md | 1 + src/others/joseph_problem.md | 89 +++++++++++++++++++++++------------- 2 files changed, 57 insertions(+), 33 deletions(-) diff --git a/src/index.md b/src/index.md index 9b15a6752..2549073c9 100644 --- a/src/index.md +++ b/src/index.md @@ -151,6 +151,7 @@ especially popular in field of competitive programming.* - [Optimal schedule of jobs given their deadlines and durations](./schedules/schedule-with-completion-duration.html) ### Miscellaneous +- [Joseph's Problem](./others/joseph_problem.html) - [15 Puzzle Game: Existence Of The Solution](./others/15-puzzle.html) --- diff --git a/src/others/joseph_problem.md b/src/others/joseph_problem.md index b6ca64ec0..4feaf4fc4 100644 --- a/src/others/joseph_problem.md +++ b/src/others/joseph_problem.md @@ -2,17 +2,21 @@ # Joseph's Problem -Problem situation - Given the natural numbers $n$ and $k$. All natural numbers from $1$ to $n$ are written in a circle. First count the $k^{th}$ number starting from the first one and delete it. Then $k$ numbers are counted starting from the next one and the $k^{th}$ is removed again, and so on. The process stops when one number remains. It is required to find the last number. +Problem situation - Given the natural numbers $n$ and $k$. +All natural numbers from $1$ to $n$ are written in a circle. First count the $k$-th number starting from the first one and delete it. Then $k$ numbers are counted starting from the next one and the $k$-th one is removed again, and so on. +The process stops when one number remains. It is required to find the last number. This task was set by **Joseph** (Flavius Josephus) in the 1st century (though in a somewhat narrower formulation: for $k = 2$). -This problem can be solved by modeling. Simplest modeling will work $O(n^{2})$. 
Using [segment tree](https://cp-algorithms.com/data_structures/segment_tree.html), we can perform modeling in $O(n \log n)$. +This problem can be solved by modeling the procedure. +Brute force modeling will work $O(n^{2})$. Using a [segment tree](/data_structures/segment_tree.html) we can improve it to $O(n \log n)$. +We want something better though. ## Modeling a $O(n)$ solution We will try to find a pattern expressing the answer for the problem $J_{n, k}$ through the solution of the previous problems. -Using modeling, we construct a table of values, for example, the following: +Using brute force modeling we can construct a table of values, for example, the following: $$\begin{matrix} n\setminus k & 1 & 2 & 3 & 4 & 5 & 6 & 7 & 8 & 9 & 10\cr 1 & 1 & 1 & 1 & 1 & 1 & 1 & 1 & 1 & 1 & 1\cr @@ -29,55 +33,74 @@ $$\begin{matrix} n\setminus k & 1 & 2 & 3 & 4 & 5 & 6 & 7 & 8 & 9 & 10\cr And here we can clearly see the following **pattern**: -$$J_ {n, k} = (J _ {(n-1), k} + k - 1) \ \% \ n + 1 $$ +$$J_ {n, k} = (J _ {(n-1), k} + k - 1) \ \bmod n + 1 $$ $$J_ {1, k} = 1 $$ -Here, 1-indexing somewhat spoils the elegance of the formula; but if you number the positions from scratch, you get a very clear formula: +Here, 1-indexing somewhat spoils the elegance of the formula; if you number the positions from 0, you get a very clear formula: -$$J_ {n, k} = (J _ {(n-1), k} + k) \ \% \ n = \sum_{i=1}^n k \ \% \ i$$ +$$J_ {n, k} = (J _ {(n-1), k} + k) \ \bmod n$$ -So, we found a solution to the problem of Joseph, working for $O(n)$ operations. +So, we found a solution to the problem of Joseph, working in $O(n)$ operations. Simple **recursive implementation** (in 1-indexing) ``` - int joseph (int n, int k) { - return n>1 ? (joseph (n-1, k) + k - 1) % n + 1 : 1; - } +int joseph(int n, int k) { + return n > 1 ? 
(joseph(n-1, k) + k - 1) % n + 1 : 1; +} ``` **Non-recursive form** : + ``` -int joseph (int n, int k) { - int res = 0; - for (int i=1; i<=n; ++i) - res = (res + k) % i; - return res + 1; - } +int joseph(int n, int k) { + int res = 0; + for (int i = 1; i <= n; ++i) + res = (res + k) % i; + return res + 1; +} ``` -## Modeling a $O(k \log n)$ solution -For relatively small $k$ we can come up with a more optimal solution than the above recursive solution in $O(n)$. If $k$ is small, then it is even intuitively clear that the algorithm does a lot of unnecessary actions, serious changes occur only when taking modulo $n$, and up to this point the algorithm simply adds the number $k$ to the answer several times. Accordingly, we can get rid of these unnecessary steps. +This formula can also be found analytically. +Again here we assume 0-indexing. +After we killed the first person, we have $n-1$ people left. +And when we repeat the procedure then we will start with the person who originally had the index $k \bmod n$. +$J_{(n-1), k}$ would be the answer for the remaining circle, if we start counting at $0$, but because we actually start with $k$ we have $J_ {n, k} = (J _ {(n-1), k} + k) \ \bmod n$. -A small complication arising from this is that after removing these numbers we will have a task with a smaller $n$, but the starting position is not in the first number, and somewhere in other place. Therefore, by invoking recursively ourselves from a problem with a new $n$, we then have to carefully transfer the result into our numbering system from its own. -Also, we need to analyze the case when $n$ becomes less than $k$ - In this case, the above optimization will degenerate into an infinite loop. +## Modeling a $O(k \log n)$ solution + +For relatively small $k$ we can come up with a better solution than the above recursive solution in $O(n)$. +If $k$ is a lot smaller than $n$, then we can kill multiple people ($\lfloor \frac{n}{k} \rfloor$) in one run without looping over.
Afterwards we have $n - \lfloor \frac{n}{k} \rfloor$ people left, and we start with the $(\lfloor \frac{n}{k} \rfloor \cdot k)$-th person. +So we have to shift by that many. +Shifting forward by $\lfloor \frac{n}{k} \rfloor \cdot k$ modulo $n$ is the same as shifting backward by $n \bmod k$, since $n - \lfloor \frac{n}{k} \rfloor \cdot k = n \bmod k$. +And since we removed every $k$-th person, we have to add the number of people that we removed before the result index. + +Also, we need to handle the case when $n$ becomes less than $k$ - in this case, the above optimization would degenerate into an infinite loop. **Implementation** (for convenience in 0-indexing): + ``` -int joseph (int n, int k) { - if (n == 1) return 0; - if (k == 1) return n-1; - if (k > n) return (joseph (n-1, k) + k) % n; - int cnt = n / k; - int res = joseph (n - cnt, k); - res -= n % k; - if (res < 0) res += n; - else res += res / (k - 1); - return res; - } +int joseph(int n, int k) { + if (n == 1) + return 0; + if (k == 1) + return n-1; + if (k > n) + return (joseph(n-1, k) + k) % n; + int cnt = n / k; + int res = joseph(n - cnt, k); + res -= n % k; + if (res < 0) + res += n; + else + res += res / (k - 1); + return res; +} ``` + +Let us estimate the **complexity** of this algorithm. Immediately note that the case $n < k$ is analyzed by the old solution, which will work in this case for $O(k)$. Now consider the algorithm itself. 
In fact, on each iteration of it, instead of $n$ numbers, we get about $n \left( 1 - \frac{1}{k} \right)$ numbers, so the total number of $x$ iterations of the algorithm can be found roughly from the following equation: $$ n \left(1 - \frac{1}{k} \right) ^ x = 1, $$ @@ -90,7 +113,7 @@ using the decomposition of the logarithm into Taylor series, we obtain an approx $$x \approx k \ln n$$ -Thus, the asymptotics of the algorithm is actually $O (k \log n)$. +Thus, the complexity of the algorithm is actually $O (k \log n)$. ## Analytical solution for $k = 2$ From d7c586362b5a2781c93899817d85c6b5062c8de7 Mon Sep 17 00:00:00 2001 From: Jakob Kogler Date: Sat, 6 Oct 2018 20:19:17 +0200 Subject: [PATCH 3/4] Rename problem part 1 --- src/index.md | 2 +- src/{others/joseph_problem.md => josephus_problem.md} | 0 2 files changed, 1 insertion(+), 1 deletion(-) rename src/{others/joseph_problem.md => josephus_problem.md} (100%) diff --git a/src/index.md b/src/index.md index 2549073c9..d35f6eb5f 100644 --- a/src/index.md +++ b/src/index.md @@ -151,7 +151,7 @@ especially popular in field of competitive programming.* - [Optimal schedule of jobs given their deadlines and durations](./schedules/schedule-with-completion-duration.html) ### Miscellaneous -- [Joseph's Problem](./others/joseph_problem.html) +- [Josephus problem](./others/josephus_problem.html) - [15 Puzzle Game: Existence Of The Solution](./others/15-puzzle.html) --- diff --git a/src/others/joseph_problem.md b/src/josephus_problem.md similarity index 100% rename from src/others/joseph_problem.md rename to src/josephus_problem.md From f3515f82747baddafc78000439cc65c5a3c2db41 Mon Sep 17 00:00:00 2001 From: Jakob Kogler Date: Sat, 6 Oct 2018 20:19:29 +0200 Subject: [PATCH 4/4] Rename part 2 --- src/josephus_problem.md | 12 ++++++------ 1 file changed, 6 insertions(+), 6 deletions(-) diff --git a/src/josephus_problem.md b/src/josephus_problem.md index 4feaf4fc4..3b246eaa7 100644 --- a/src/josephus_problem.md +++ 
b/src/josephus_problem.md @@ -1,12 +1,12 @@ - + -# Joseph's Problem +# Josephus Problem Problem situation - Given the natural numbers $n$ and $k$. All natural numbers from $1$ to $n$ are written in a circle. First count the $k$-th number starting from the first one and delete it. Then $k$ numbers are counted starting from the next one and the $k$-th one is removed again, and so on. The process stops when one number remains. It is required to find the last number. -This task was set by **Joseph** (Flavius Josephus) in the 1st century (though in a somewhat narrower formulation: for $k = 2$). +This task was set by **Flavius Josephus** in the 1st century (though in a somewhat narrower formulation: for $k = 2$). This problem can be solved by modeling the procedure. Brute force modeling will work $O(n^{2})$. Using a [segment tree](/data_structures/segment_tree.html) we can improve it to $O(n \log n)$. @@ -45,7 +45,7 @@ So, we found a solution to the problem of Joseph, working in $O(n)$ operations. Simple **recursive implementation** (in 1-indexing) ``` -int joseph(int n, int k) { +int josephus(int n, int k) { - return n > 1 ? (joseph(n-1, k) + k - 1) % n + 1 : 1; + return n > 1 ? (josephus(n-1, k) + k - 1) % n + 1 : 1; } ``` @@ -53,7 +53,7 @@ int joseph(int n, int k) { **Non-recursive form** : ``` -int joseph(int n, int k) { +int josephus(int n, int k) { int res = 0; for (int i = 1; i <= n; ++i) res = (res + k) % i; @@ -82,7 +82,7 @@ Also, we need to handle the case when $n$ becomes less than $k$ - in this case, **Implementation** (for convenience in 0-indexing): ``` -int joseph(int n, int k) { +int josephus(int n, int k) { if (n == 1) return 0; if (k == 1)
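Neither version of the article spells out the $O(n \log n)$ segment-tree modeling it mentions; one possible sketch, assuming a sum segment tree over "alive" flags with a descend-to-the-k-th-alive query (all names here, such as `josephus_segtree`, are illustrative, not part of the patches):

```cpp
#include <vector>

// Sum segment tree over "alive" flags; find_kth descends to the
// k-th (1-indexed) still-alive position in O(log n).
struct SegTree {
    int n;
    std::vector<int> t;
    SegTree(int n) : n(n), t(4 * n, 0) { build(1, 0, n - 1); }
    void build(int v, int l, int r) {
        if (l == r) { t[v] = 1; return; }
        int m = (l + r) / 2;
        build(2 * v, l, m);
        build(2 * v + 1, m + 1, r);
        t[v] = t[2 * v] + t[2 * v + 1];
    }
    void remove(int v, int l, int r, int pos) {
        if (l == r) { t[v] = 0; return; }
        int m = (l + r) / 2;
        if (pos <= m) remove(2 * v, l, m, pos);
        else remove(2 * v + 1, m + 1, r, pos);
        t[v] = t[2 * v] + t[2 * v + 1];
    }
    int find_kth(int v, int l, int r, int k) {
        if (l == r) return l;
        int m = (l + r) / 2;
        if (t[2 * v] >= k) return find_kth(2 * v, l, m, k);
        return find_kth(2 * v + 1, m + 1, r, k - t[2 * v]);
    }
};

// O(n log n) modeling; returns the survivor in 1-indexing.
int josephus_segtree(int n, int k) {
    SegTree st(n);
    int alive = n, pos = 0;  // pos = 0-based rank of the counting start among alive
    while (alive > 1) {
        pos = (pos + k - 1) % alive;  // rank of the victim among the alive
        st.remove(1, 0, n - 1, st.find_kth(1, 0, n - 1, pos + 1));
        --alive;
        // the person after the victim now occupies rank `pos`
    }
    return st.find_kth(1, 0, n - 1, 1) + 1;
}
```

The counting logic is the same as in the naive vector simulation; only the "find the k-th remaining person and remove them" step is replaced by two $O(\log n)$ tree operations.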
