on 01-Jan-2024 (Mon)


Annotation 7608433249548

Situations of the result of Hamming Decoding
Using the resultant $$n-k=3$$-bit syndrome $$\mathbf{s}$$, the Hamming decoder decides whether it thinks there are any bit errors in $$\hat{\mathbf{y}}$$:
- If the syndrome is $$\mathbf{s}=\left[\begin{array}{lll}0 & 0 & 0\end{array}\right]^{T}$$, the Hamming decoder thinks there are no bit errors in $$\hat{\mathbf{y}}$$ (though it may be wrong). In this case it outputs $$\hat{\mathbf{x}}=\left[\begin{array}{llll}\hat{y}_3 & \hat{y}_5 & \hat{y}_6 & \hat{y}_7\end{array}\right]^{T}$$, since $$y_3=x_1$$, $$y_5=x_2$$, $$y_6=x_3$$ and $$y_7=x_4$$ in $$\mathbf{G}$$.
- If the syndrome $$\mathbf{s}$$ is not equal to $$\left[\begin{array}{lll}0 & 0 & 0\end{array}\right]^{T}$$, it is read as a 3-bit binary number and converted into a decimal number $$i \in [1, 7]$$. In this case, the Hamming decoder thinks that the $$i$$th bit in $$\hat{\mathbf{y}}$$ has been flipped by a bit error (though it may be wrong). The decoder flips the $$i$$th bit in $$\hat{\mathbf{y}}$$ before outputting $$\hat{\mathbf{x}}=\left[\begin{array}{llll}\hat{y}_3 & \hat{y}_5 & \hat{y}_6 & \hat{y}_7\end{array}\right]^{T}$$. If there are multiple bit errors in the received codeword $$\hat{\mathbf{y}}$$, the syndrome $$\mathbf{s}$$ identifies which single bit of $$\hat{\mathbf{y}}$$ can be toggled to give the legitimate codeword that is most similar to $$\hat{\mathbf{y}}$$.
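The two cases above can be sketched in Python. This is a minimal illustration, assuming the standard (7,4) Hamming parity-check matrix $$\mathbf{H}$$ whose $$j$$th column is the binary representation of $$j$$, so the syndrome read as a binary number directly names the suspect bit position:

```python
# Sketch of (7,4) Hamming syndrome decoding. Assumes the standard H
# whose j-th column is the binary representation of j (MSB in row 0),
# with data bits carried in positions 3, 5, 6, 7 of the codeword.
H = [[0, 0, 0, 1, 1, 1, 1],   # 4s bit of the column index
     [0, 1, 1, 0, 0, 1, 1],   # 2s bit
     [1, 0, 1, 0, 1, 0, 1]]   # 1s bit

def hamming_decode(y_hat):
    """Decode a received 7-bit word, returning the 4 recovered data bits."""
    y = list(y_hat)
    # 3-bit syndrome s = H y (mod 2)
    s = [sum(h * b for h, b in zip(row, y)) % 2 for row in H]
    i = 4 * s[0] + 2 * s[1] + s[2]    # syndrome read as a decimal position
    if i != 0:                        # nonzero syndrome: flip the i-th bit
        y[i - 1] ^= 1
    return [y[2], y[4], y[5], y[6]]   # positions 3, 5, 6, 7 carry x1..x4
```

For example, the all-zero syndrome leaves a valid codeword such as $$[0\ 1\ 1\ 0\ 0\ 1\ 1]^T$$ untouched, while flipping any single bit of it produces a nonzero syndrome that points at the flipped position.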

Flashcard 7608436919564

Question
Using the resultant $$n-k=3$$-bit syndrome $$\mathbf{s}$$, the Hamming decoder decides whether it thinks there are any bit errors in $$\hat{\mathbf{y}}$$: [Answer all the situations of Hamming Decoding]
Answer

- If the syndrome is $$\mathbf{s}=\left[\begin{array}{lll}0 & 0 & 0\end{array}\right]^{T}$$, the Hamming decoder thinks there are no bit errors in $$\hat{\mathbf{y}}$$ (though it may be wrong). In this case it outputs $$\hat{\mathbf{x}}=\left[\begin{array}{llll}\hat{y}_3 & \hat{y}_5 & \hat{y}_6 & \hat{y}_7\end{array}\right]^{T}$$, since $$y_3=x_1$$, $$y_5=x_2$$, $$y_6=x_3$$ and $$y_7=x_4$$ in $$\mathbf{G}$$.
- If the syndrome $$\mathbf{s}$$ is not equal to $$\left[\begin{array}{lll}0 & 0 & 0\end{array}\right]^{T}$$, it is read as a 3-bit binary number and converted into a decimal number $$i \in [1, 7]$$. In this case, the Hamming decoder thinks that the $$i$$th bit in $$\hat{\mathbf{y}}$$ has been flipped by a bit error (though it may be wrong). The decoder flips the $$i$$th bit in $$\hat{\mathbf{y}}$$ before outputting $$\hat{\mathbf{x}}=\left[\begin{array}{llll}\hat{y}_3 & \hat{y}_5 & \hat{y}_6 & \hat{y}_7\end{array}\right]^{T}$$. If there are multiple bit errors in the received codeword $$\hat{\mathbf{y}}$$, the syndrome $$\mathbf{s}$$ identifies which single bit of $$\hat{\mathbf{y}}$$ can be toggled to give the legitimate codeword that is most similar to $$\hat{\mathbf{y}}$$.


Annotation 7608761191692

Throughout this book, we use one of the oldest teaching methods: the dialogue.

Annotation 7609022024972

Law of Total Probability
Law of Total Probability: If $$M_{i}$$, $$i = 1, 2, 3, \ldots$$, is a partition of $$\Omega$$, then $$P(A) = \sum\limits_{i} P(A \cap M_{i}) = \sum\limits_{i} P(A|M_{i})P(M_{i})$$
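The law can be checked numerically on a toy discrete sample space; the fair-die setting below is my own example, not taken from the source:

```python
from fractions import Fraction

# Toy check of the law of total probability on Omega = {1,...,6}
# (a fair die). M1 = odd outcomes and M2 = even outcomes form a
# partition of Omega; A is the event "roll at least 5".
p = {w: Fraction(1, 6) for w in range(1, 7)}   # P({omega})

def P(E):
    return sum(p[w] for w in E)

M = [{1, 3, 5}, {2, 4, 6}]   # a partition of Omega
A = {5, 6}

# P(A) = sum_i P(A ∩ M_i)
lhs = sum(P(A & Mi) for Mi in M)
# P(A) = sum_i P(A | M_i) P(M_i)
rhs = sum((P(A & Mi) / P(Mi)) * P(Mi) for Mi in M)
```

Both sums reduce to $$P(A)$$ exactly because the $$M_i$$ are disjoint and cover $$\Omega$$.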

Flashcard 7609023859980

Question

Law of Total Probability: [...]

Answer

If $$M_{i}$$, $$i = 1, 2, 3, \ldots$$, is a partition of $$\Omega$$, then $$P(A) = \sum\limits_{i} P(A \cap M_{i}) = \sum\limits_{i} P(A|M_{i})P(M_{i})$$


Annotation 7609025432844

Bayes’ Theorem
Bayes’ Theorem: For any two events $$A$$ and $$B$$ with $$P(B) > 0$$, $$P(A|B) = P(B|A) \frac{P(A)}{P(B)}$$
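A quick numeric sanity check, again on a fair die (my example, with $$P(A), P(B) > 0$$ so both conditional probabilities are defined):

```python
from fractions import Fraction

# Check Bayes' theorem on Omega = {1,...,6}:
# A = "roll is odd", B = "roll is at least 4".
p = {w: Fraction(1, 6) for w in range(1, 7)}

def P(E):
    return sum(p[w] for w in E)

A, B = {1, 3, 5}, {4, 5, 6}

P_A_given_B = P(A & B) / P(B)              # definition of P(A|B)
bayes_rhs = (P(A & B) / P(A)) * P(A) / P(B)  # P(B|A) * P(A) / P(B)
```

Both expressions equal $$P(A \cap B)/P(B)$$, which is why the theorem is an identity rather than an approximation.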

Flashcard 7609029365004

Question

Bayes’ Theorem: [...]

Answer
For any two events $$A$$ and $$B$$ with $$P(B) > 0$$, $$P(A|B) = P(B|A) \frac{P(A)}{P(B)}$$


Annotation 7609032510732

Expected Value of a Random Variable
The expected value of $$X$$ is $$\displaystyle\operatorname{E}[X]=\int_{\Omega}X(\omega)\,P(d\omega)=\int_{\mathbb{R}}x\,\mu(dx)$$
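For a discrete random variable both abstract integrals reduce to finite sums, which makes the equality easy to see concretely. A sketch, using a fair die as an assumed example with $$\mu$$ the pushforward of $$P$$ under $$X$$:

```python
from fractions import Fraction

# X(omega) = omega on Omega = {1,...,6} with the uniform measure P.
P = {w: Fraction(1, 6) for w in range(1, 7)}   # P({omega})
X = lambda w: w                                # the random variable

# Integral over Omega: E[X] = sum_omega X(omega) P({omega})
E_omega = sum(X(w) * P[w] for w in P)

# Pushforward measure mu({x}) = P(X = x)
mu = {}
for w in P:
    mu[X(w)] = mu.get(X(w), Fraction(0)) + P[w]

# Integral over R: E[X] = sum_x x mu({x})
E_mu = sum(x * m for x, m in mu.items())
```

The two sums agree by construction of $$\mu$$; this is the discrete shadow of the change-of-variables identity in the formula above.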

Flashcard 7609038540044

Question
The expected value of $$X$$ is [...]
Answer
$$\displaystyle\operatorname{E}[X]=\int_{\Omega}X(\omega)P(d\omega)=\int_\mathbb{R}x\mu(dx)$$


Annotation 7609040112908

Variance of a Random Variable
The variance of $$X$$ is: $$\begin{aligned}\operatorname{Var}[X] & =\mathbf{E}\left[(X-\mathbf{E}[X])^2\right] \\ & =\int_{\Omega}(X(\omega)-\mathbf{E}[X])^2\, P(d\omega)=\int_{\mathbb{R}}(x-\mathbf{E}[X])^2\, \mu(dx)\end{aligned}$$
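As with the expectation, the two integral forms collapse to the same finite sum in the discrete case. A sketch on the assumed fair-die example:

```python
from fractions import Fraction

# Variance of X(omega) = omega on Omega = {1,...,6}, uniform P,
# computed both as a sum over Omega and as a sum over the values x
# against mu({x}) = P(X = x) = 1/6.
P = {w: Fraction(1, 6) for w in range(1, 7)}
EX = sum(w * P[w] for w in P)                      # E[X] = 7/2

# Integral over Omega of (X(omega) - E[X])^2
var_omega = sum((w - EX) ** 2 * P[w] for w in P)
# Integral over R of (x - E[X])^2 against mu
var_mu = sum((x - EX) ** 2 * Fraction(1, 6) for x in range(1, 7))
```

Both evaluate to $$35/12$$, matching the textbook value for a fair six-sided die.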

Flashcard 7609042734348

Question
The variance of $$X$$ is: [...]
Answer

$$\begin{aligned}\operatorname{Var}[X] & =\mathbf{E}\left[(X-\mathbf{E}[X])^2\right] \\ & =\int_{\Omega}(X(\omega)-\mathbf{E}[X])^2\, P(d\omega)=\int_{\mathbb{R}}(x-\mathbf{E}[X])^2\, \mu(dx)\end{aligned}$$
