Random Variables & Distribution Functions MCQ Quiz - Objective Question with Answer for Random Variables & Distribution Functions

Last updated on Jul 7, 2025

Latest Random Variables & Distribution Functions MCQ Objective Questions

Random Variables & Distribution Functions Question 1:

Suppose that a sequence of random variables {Xn}n ≥ 1 and the random variable X are defined on the same probability space. Then which of the following statements are true? 

  1. Xn converges to X almost surely as n → ∞ implies that Xn converges to X in probability as n → ∞. 
  2. Xn converges to X in probability as n → ∞ implies that Xn converges to X almost surely as n → ∞.
  3. If \(\rm \sum_{n=1}^\infty P(|X_n-X|>\delta)<\infty\) for all δ > 0, then Xn converges to X almost surely as n → ∞.
  4. If Xn converges to X in distribution as n → ∞, and X is a constant with probability 1, then Xn converges to X in probability as n → ∞. 

Answer (Detailed Solution Below)

Option :

Random Variables & Distribution Functions Question 1 Detailed Solution

We will update the solution soon.

Random Variables & Distribution Functions Question 2:

Consider a linear model 

Yi = β1 + β2 + ... + βi + ϵi, 1 ≤ i ≤ n,

where errors ϵi’s are uncorrelated with zero mean and finite variance σ2 > 0. Let β̂i be the best linear unbiased estimator (BLUE) of βi, i = 1, 2, ..., n. Then, which of the following statements are true? 

  1. The residual sum of squares is strictly positive with probability 1. 
  2. For every βi, 1 ≤ i ≤ n, there are infinitely many linear unbiased estimators. 
  3. Var(β̂1) = σ2
  4. Y3 - Y2 is the BLUE of β3.

Answer (Detailed Solution Below)

Option :

Random Variables & Distribution Functions Question 2 Detailed Solution

We will update the solution soon.

Random Variables & Distribution Functions Question 3:

Let X be a random variable with the cumulative distribution function (CDF) given by \(F(x) = \begin{cases} 0, & x < 0, \\ \frac{x+2}{5}, & 0 \leq x < 3, \\ 1, & x \geq 3. \end{cases}\)

Find the value of \(P\left(1 < X \leq 2\right) + P(X = 0)\).

  1. 1/5
  2. 2/5
  3. 3/5
  4. 4/5

Answer (Detailed Solution Below)

Option 3 : 3/5

Random Variables & Distribution Functions Question 3 Detailed Solution

Solution:  

We use the properties of the cumulative distribution function (CDF).

Step 1: Calculate \(P(1 < X ≤ 2)\)

Using the CDF properties, for \(a < X ≤ b\):

\(P(a < X ≤ b) = F(b) - F(a)\).

Here, a = 1 and b = 2.

For 0 ≤ x < 3, the CDF is given by \(F(x) = \frac{x+2}{5}\).

Substituting these values:

\(F(2) = \frac{2 + 2}{5} = \frac{4}{5}, \quad F(1) = \frac{1 + 2}{5} = \frac{3}{5}.\)

Thus:

\(P(1 < X ≤ 2) = F(2) - F(1) = \frac{4}{5} - \frac{3}{5} = \frac{1}{5}.\)

Step 2: Calculate P(X = 0)

The probability P(X = 0) is the jump of the CDF at X = 0, i.e. P(X = 0) = F(0+) - F(0-).

Here F(0+) = F(0) = \(\frac{0+2}{5} = \frac{2}{5}\) and F(0-) = 0, so \(P(X = 0) = \frac{2}{5}\). (The CDF is not continuous at 0, so this point mass is not zero.)

Step 3: Add the probabilities

Combining the results:

\(P(1 < X ≤ 2) + P(X = 0) = \frac{1}{5} + \frac{2}{5} = \frac{3}{5}.\)

Hence the correct option is (3)

 

Random Variables & Distribution Functions Question 4:

Let X and Y be jointly distributed continuous random variables with joint probability density function \(\rm f(x, y)=\left\{\begin{matrix}\frac{x}{y}, & if\ 0

Which of the following statements are true? 

  1. \(\rm P\left(X<\frac{1}{2}|Y=1\right)=\frac{1}{4}\)
  2. E(Y) = \(\frac{1}{4}\)
  3. \(\rm P\left(X < \frac{Y}{2}\right)=\frac{1}{4}\)
  4. \(\rm E\left(\frac{Y}{X}\right)=\frac{1}{4}\)

Answer (Detailed Solution Below)

Option :

Random Variables & Distribution Functions Question 4 Detailed Solution

The correct answers are (1) and (3).

We will update the solution later.

Random Variables & Distribution Functions Question 5:

Let {Xn}n≥1 be a sequence of independent and identically distributed random variables with E(X1) = 0 and Var(X1) = 1. Which of the following statements are true? 

  1. \(\rm \lim_{n\rightarrow \infty}P\left(\frac{\sqrt n \Sigma_{i=1}^nX_i}{\Sigma_{i=1}^nX_i^2}\le 0\right)=\frac{1}{2}\)
  2. \(\frac{ \Sigma_{i=1}^nX_i}{\Sigma_{i=1}^nX_i^2}\) converges in probability to 0 as n → ∞
  3. \(\rm \frac{1}{n}\Sigma_{i=1}^nX_i^2\) converges in probability to 1 as n → ∞
  4. \(\rm \lim_{n\rightarrow \infty}P\left(\frac{ \Sigma_{i=1}^nX_i}{\sqrt n}\le 0\right)=\frac{1}{2}\)

Answer (Detailed Solution Below)

Option :

Random Variables & Distribution Functions Question 5 Detailed Solution

Concept:

1. Law of Large Numbers (LLN):

The Law of Large Numbers states that as the sample size n increases, the sample average (or sum) of i.i.d.

random variables converges to the expected value of the variable. For example, in Option 3, \(\frac{1}{n} \sum_{i=1}^n X_i^2\) 

converges to the expected value \(E(X_1^2) = 1\), because \(X_1\) has variance 1.

2. Central Limit Theorem (CLT):

The Central Limit Theorem tells us that the sum (or scaled average) of i.i.d. random variables with finite mean and

variance converges in distribution to a normal distribution as \(n \to \infty\) . For instance, in Option 4,  \(\frac{\sum_{i=1}^n X_i}{\sqrt{n}}\) behaves like a

standard normal random variable as  \(n \to \infty\) , converging to N(0, 1) .

3. Probability Limits:

For certain random variables, their distribution converges to a fixed probability value. In Options 1 and 4,

the probability of the standardized sum being less than or equal to 0 converges to \(\frac{1}{2} \), which is the probability

that a standard normal variable is less than or equal to 0.

Explanation:

Option 1: Write the expression as \(\frac{\sqrt{n} \sum_{i=1}^n X_i}{\sum_{i=1}^n X_i^2}=\frac{\sum_{i=1}^n X_i/\sqrt{n}}{\frac{1}{n}\sum_{i=1}^n X_i^2}\).

By the Central Limit Theorem the numerator \(\sum_{i=1}^n X_i/\sqrt{n}\) converges in distribution to N(0, 1), and by the Law of Large Numbers the denominator \(\frac{1}{n}\sum_{i=1}^n X_i^2\) converges in probability to \(E(X_1^2) = 1\).

By Slutsky's theorem the ratio converges in distribution to a standard normal random variable, so the probability that it is less than or equal to 0 converges to \(P(Z \le 0)=\frac{1}{2}\). Hence, Option 1 is true.

Option 2: Write \(\frac{\sum_{i=1}^n X_i}{\sum_{i=1}^n X_i^2}=\frac{1}{\sqrt{n}}\cdot\frac{\sum_{i=1}^n X_i/\sqrt{n}}{\frac{1}{n}\sum_{i=1}^n X_i^2}\). The second factor converges in distribution to N(0, 1) as above, while the factor \(\frac{1}{\sqrt{n}}\) tends to 0, so the product converges in probability to 0 as \(n \to \infty\). Thus, Option 2 is true.

Option 3: By the Law of Large Numbers (LLN), the average of the squares of i.i.d. random variables converges to the

expected value of \(X_1^2 \), which is 1, as \(n \to \infty \). Thus, Option 3 is true.

Option 4: 

By the Central Limit Theorem (CLT), \(\frac{\sum_{i=1}^n X_i}{\sqrt{n}} \) converges in distribution to a standard normal random variable N(0, 1).

The probability that a standard normal variable is less than or equal to 0 is \(P(Z \leq 0) = \frac{1}{2} \). Hence, Option 4 is true.
 

All four options are correct.
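As an informal check on Options 3 and 4, a short simulation (a sketch using only Python's standard library; the sample sizes 1000 and 2000 are arbitrary choices) tracks \((1/n)\sum X_i^2\) and the frequency of \(\sum X_i/\sqrt{n} \le 0\):

```python
import random
import statistics

random.seed(0)

def simulate(num_reps=2000, n=1000):
    """For each repetition draw X_1, ..., X_n i.i.d. N(0, 1) and record
    (1/n) * sum(X_i^2) and whether sum(X_i) / sqrt(n) <= 0."""
    mean_sq, le_zero = [], 0
    for _ in range(num_reps):
        xs = [random.gauss(0.0, 1.0) for _ in range(n)]
        mean_sq.append(sum(x * x for x in xs) / n)
        if sum(xs) / n ** 0.5 <= 0:
            le_zero += 1
    return statistics.mean(mean_sq), le_zero / num_reps

avg_sq, prob = simulate()
print(avg_sq)  # LLN (Option 3): concentrates near E[X_1^2] = 1
print(prob)    # CLT (Option 4): frequency near P(Z <= 0) = 1/2
```

Both printed values should sit close to their limits 1 and 1/2 for these sample sizes.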

Top Random Variables & Distribution Functions MCQ Objective Questions

Which of the following is a valid cumulative distribution function?

  1. F(x) = \(\begin{cases}\rm \frac{1}{2+x^2} & \text { if } \rm x<0, \\\rm \frac{2+x^2}{3+x^2} & \text { if } \rm x \geq 0\end{cases}\)
  2. F(x) = \(\begin{cases}\rm\frac{1}{2+x^2} & \text { if } \rm x<0, \\ \rm\frac{2+x^2}{3+2 x^2} & \text { if } \rm x \geq 0\end{cases}\)
  3. F(x) = \(\begin{cases}\rm\frac{1}{2+x^2} & \text { if } \rm x<0 \text {, } \\ \rm\frac{2 \cos (x)+x^2}{4+x^2} & \text { if } \rm x \geq 0\end{cases}\)
  4. F(x) = \(\begin{cases}\rm\frac{1}{2+x^2} & \text { if } \rm x<0, \\ \rm\frac{1+x^2}{4+x^2} & \text { if } \rm x \geq 0\end{cases}\)

Answer (Detailed Solution Below)

Option 1 : F(x) = \(\begin{cases}\rm \frac{1}{2+x^2} & \text { if } \rm x<0, \\\rm \frac{2+x^2}{3+x^2} & \text { if } \rm x \geq 0\end{cases}\)

Random Variables & Distribution Functions Question 6 Detailed Solution


Concept:

Let F(x) be a cumulative distribution function then

(i) \(\lim_{x\to-\infty}F(x)\) = 0, \(\lim_{x\to\infty}F(x)\) = 1

(ii) F is a non-decreasing function

Explanation:

(2): F(x) = \(\begin{cases}\rm\frac{1}{2+x^2} & \text { if } \rm x<0, \\ \rm\frac{2+x^2}{3+2 x^2} & \text { if } \rm x \geq 0\end{cases}\)

\(\lim_{x\to\infty}F(x)\) = \(\lim_{x\to\infty}\frac{2+x^2}{3+2x^2}\) = \(\lim_{x\to\infty}\frac{2x}{4x}\) = \(\frac12\) (Using L'hospital rule), not satisfying

Option (2) is false

(3): F(x) = \(\begin{cases}\rm\frac{1}{2+x^2} & \text { if } \rm x<0 \text {, } \\ \rm\frac{2 \cos (x)+x^2}{4+x^2} & \text { if } \rm x \geq 0\end{cases}\)

\(\lim_{x\to\infty}F(x)\) = \(\lim_{x\to\infty}\frac{2\cos x+x^2}{4+x^2}\) = 1, since cos x is bounded while x2 → ∞, so property (i) does hold here.

However, F is not a non-decreasing function: F(0) = \(\frac{2}{4}\) = \(\frac12\), while \(F\left(\frac{\pi}{2}\right)=\frac{\pi^2/4}{4+\pi^2/4}\) ≈ 0.38 < \(\frac12\), so F decreases on part of \(\left(0, \frac{\pi}{2}\right)\), not satisfying property (ii).

Option (3) is false

(4): F(x) = \(\begin{cases}\rm\frac{1}{2+x^2} & \text { if } \rm x<0, \\ \rm\frac{1+x^2}{4+x^2} & \text { if } \rm x \geq 0\end{cases}\)

F(0-) = 1/2 and F(0+) = F(0) = 1/4, and \(\frac12>\frac14\), so F jumps down at 0 and does not satisfy the property "F is a non-decreasing function"

Option (4) is false

(1): \(\frac{1}{2+x^2}\) increases from 0 to \(\frac12\) on (-∞, 0), F(0) = \(\frac23 \ge \frac12\), and \(\frac{2+x^2}{3+x^2}=1-\frac{1}{3+x^2}\) increases to 1 on [0, ∞), so F is non-decreasing with limits 0 and 1.

Therefore option (1) is correct
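The two defining properties can also be checked numerically. The sketch below (the grid bounds ±50 are an arbitrary stand-in for the limits at ±∞) evaluates each candidate on a grid and tests the limit and monotonicity conditions:

```python
import math

# The four candidate CDFs from the question, as plain functions.
def F1(x): return 1 / (2 + x * x) if x < 0 else (2 + x * x) / (3 + x * x)
def F2(x): return 1 / (2 + x * x) if x < 0 else (2 + x * x) / (3 + 2 * x * x)
def F3(x): return 1 / (2 + x * x) if x < 0 else (2 * math.cos(x) + x * x) / (4 + x * x)
def F4(x): return 1 / (2 + x * x) if x < 0 else (1 + x * x) / (4 + x * x)

def looks_like_cdf(F, lo=-50.0, hi=50.0, steps=20001):
    """Grid check of the two properties: F(-inf) = 0 and F(+inf) = 1
    (approximated at +/-50), and F non-decreasing along the grid."""
    xs = [lo + (hi - lo) * i / (steps - 1) for i in range(steps)]
    vals = [F(x) for x in xs]
    limits_ok = vals[0] < 1e-3 and vals[-1] > 1 - 1e-2
    monotone = all(b >= a - 1e-12 for a, b in zip(vals, vals[1:]))
    return limits_ok and monotone

valid = [name for name, F in [("1", F1), ("2", F2), ("3", F3), ("4", F4)]
         if looks_like_cdf(F)]
print(valid)  # -> ['1']
```

Candidate 2 fails the limit at +∞ (it tends to 1/2), while candidates 3 and 4 fail the monotonicity check, matching the analysis above.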

Let X be a Poisson random variable with mean λ. Which of the following parametric function is not estimable?

  1. λ−1
  2. λ
  3. λ2
  4. e−λ​

Answer (Detailed Solution Below)

Option 1 : λ−1

Random Variables & Distribution Functions Question 7 Detailed Solution


Concept:

A parametric function f(λ) is said to be estimable if there exists a statistic g(X) such that E(g(X)) = f(λ) for all λ; otherwise it is called not estimable.

Explanation:

Given X be a Poisson random variable with mean λ

So E(X) = λ and Var(X) = λ

We have to find the parametric function which is not estimable.

(2): E(X) = λ so here we are getting a function g(X) = X

So it is estimable

Option (2) is false

(3): E(X2) = [E(X)]2 + Var(X) = λ2 + λ

So E(X2 - X) = E(X2) - E(X) =  λ2 + λ - λ = λ2 

Here we are getting the function g(X) = X2 - X

 So it is estimable

Option (3) is false

(4): Take g(X) = 1 if X = 0 and g(X) = 0 otherwise. Then E(g(X)) = P(X = 0) = e−λ​

Here we are getting the function g(X) = \(\mathbb{1}\{X = 0\}\)

 So it is estimable 

Option (4) is false

(1): For any statistic g, E(g(X)) = \(e^{-\lambda}\sum_{x=0}^\infty \frac{g(x)\lambda^x}{x!}\) tends to g(0) as λ → 0+, so it stays bounded near 0, whereas λ−1 → ∞. No unbiased estimator of λ−1 exists, so it is not estimable.

Hence option (1) is correct
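The unbiased estimators above can be verified numerically by summing g(x) against the Poisson pmf. A small sketch (the truncation point 100 and the values of λ are arbitrary choices; the series tail beyond the cutoff is negligible for these λ):

```python
import math

def expected(g, lam, cutoff=100):
    """E[g(X)] for X ~ Poisson(lam), truncating the series at `cutoff`."""
    total, pmf = 0.0, math.exp(-lam)      # pmf = P(X = 0)
    for x in range(cutoff):
        total += g(x) * pmf
        pmf *= lam / (x + 1)              # P(X = x + 1) from P(X = x)
    return total

for lam in (0.5, 1.0, 3.0):
    assert abs(expected(lambda x: x, lam) - lam) < 1e-9               # g(X) = X for lambda
    assert abs(expected(lambda x: x * x - x, lam) - lam ** 2) < 1e-9  # X^2 - X for lambda^2
    assert abs(expected(lambda x: 1.0 if x == 0 else 0.0, lam)
               - math.exp(-lam)) < 1e-9                               # 1{X = 0} for e^-lambda
print("all unbiasedness checks passed")
```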

Let X be a random variable with cumulative distribution function given by \(\rm F(x)=\left\{\begin{matrix}0,&\ if\ x<0\\\ \frac{x+1}{3},&\ if\ 0\le x < 1\\\ 1, & \ if\ x \ge 1\end{matrix}\right.\) Then the value of \(\rm P\left(\frac{1}{3}<X\le \frac{3}{4}\right)+P(X=0)\) is equal to

  1. \(\frac{7}{36}\)
  2. \(\frac{11}{36}\)
  3. \(\frac{13}{36}\)
  4. \(\frac{17}{36}\)

Answer (Detailed Solution Below)

Option 4 : \(\frac{17}{36}\)

Random Variables & Distribution Functions Question 8 Detailed Solution


Concepts Used:

1. Cumulative Distribution Function (CDF):

The CDF F(x) gives the probability that the random variable X takes a value less than or equal to x . That is, F(x) = P(X ≤ x) .

2. Finding Probability Using the CDF:

The probability that the random variable X lies within a certain interval (a, b] is given by:
     
 P(a < X ≤ b) = F(b) - F(a)

3. Probability at a Specific Point (Jump Discontinuity):

The probability at a specific point x = c is the difference in the CDF just to the right and just to the left of c :
     
P(X = c) = F(c+) - F(c-)

Explanation -

We are given a cumulative distribution function (CDF) F(x) of a random variable X as:

F(x) = \(\begin{cases} 0, & \text{if } x < 0 \\ \frac{x+1}{3}, & \text{if } 0 \leq x < 1 \\ 1, & \text{if } x \geq 1 \end{cases}\)

From the definition of the probability from the CDF:  P(a < X \(\le \) b) = F(b) - F(a)

In this case, we need to calculate \(F\left(\frac{3}{4}\right)\) and \(F\left(\frac{1}{3}\right)\).

Since \( 0 \le \frac{1}{3} < 1\)  and \(0 \le \frac{3}{4} < 1\) , we use the formula \(F(x) = \frac{x+1}{3}\) for both 1/3  and 3/4 :

\(F\left(\frac{1}{3}\right) = \frac{\frac{1}{3} + 1}{3} = \frac{\frac{4}{3}}{3} = \frac{4}{9}\)

\(\Rightarrow F\left(\frac{3}{4}\right) = \frac{\frac{3}{4} + 1}{3} = \frac{\frac{7}{4}}{3} = \frac{7}{12}\)

Thus, the probability \(P\left(\frac{1}{3} < X \leq \frac{3}{4}\right)\) is:

\(P\left(\frac{1}{3} < X \leq \frac{3}{4}\right) = F\left(\frac{3}{4}\right) - F\left(\frac{1}{3}\right) = \frac{7}{12} - \frac{4}{9}\)

= \(\frac{21}{36} - \frac{16}{36} = \frac{5}{36}\)

The probability at a point is the jump in the CDF at that point. We need to calculate F(0+) - F(0-) .

From the CDF definition: F(0+) = F(0) =\( \frac{0+1}{3} = \frac{1}{3}\)

⇒ F(0-) = 0

Thus, \(P(X = 0) = F(0^+) - F(0^-) = \frac{1}{3} - 0 = \frac{1}{3} = \frac{12}{36}\)

 

Now, we add the two results:

\(P\left(\frac{1}{3} < X \leq \frac{3}{4}\right) + P(X = 0) = \frac{5}{36} + \frac{12}{36} = \frac{17}{36}\)

Thus, the final answer is 17/36.
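The whole computation can be reproduced in exact arithmetic. A minimal sketch with Python's `fractions` module (the evaluation point −10⁻⁹ is an arbitrary stand-in for the left limit F(0−)):

```python
from fractions import Fraction

def F(x):
    """The CDF from the question: 0 for x < 0, (x + 1)/3 on [0, 1), 1 for x >= 1."""
    if x < 0:
        return Fraction(0)
    if x < 1:
        return (Fraction(x) + 1) / 3
    return Fraction(1)

interval = F(Fraction(3, 4)) - F(Fraction(1, 3))  # P(1/3 < X <= 3/4) = F(3/4) - F(1/3)
jump_at_0 = F(0) - F(Fraction(-1, 10**9))         # P(X = 0) = F(0+) - F(0-)
answer = interval + jump_at_0
print(answer)  # -> 17/36
```

The interval term evaluates to 5/36 and the jump at 0 to 1/3 = 12/36, matching the derivation above.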

Let X1, X2, ... be i.i.d. random variables having a χ2-distribution with 5 degrees of freedom.
Let a ∈ \(\mathbb{R}\) be constant. Then the limiting distribution of \(a\left(\frac{X_1+\cdots+X_n-5 n}{\sqrt{n}}\right)\) is 

  1. Gamma distribution for an appropriate value of a
  2. χ2-distribution for an appropriate value of a
  3. Standard normal distribution for an appropriate value of a
  4. A degenerate distribution for an appropriate value of a

Answer (Detailed Solution Below)

Option 3 : Standard normal distribution for an appropriate value of a

Random Variables & Distribution Functions Question 9 Detailed Solution


Given:-

X1, X2, ... are i.i.d. random variables having a χ2-distribution with 5 degrees of freedom.

Concept Used:-

The limiting distribution of the given expression can be found using the central limit theorem.

The central limit theorem states that the sum of many independent and identically distributed random variables, properly normalized, converges in distribution to a normal distribution.

Explanation:-

Here, we have n i.i.d. random variables with a χ2-distribution with 5 degrees of freedom.

The mean of each χ2-distributed variable is 5 and the variance is,

2 × 5 = 10

Therefore, the mean of the sum of n such variables is 5n, and the variance is,

⇒ variance = (n × 10)

We can center the sum by its mean and rescale. That is,

\(a\left[\dfrac{X_1 + X_ 2 + ... + X_ n - 5n}{\sqrt{n}}\right] = a\sqrt{10}\left[\dfrac{X_ 1 + X_ 2 + ... + X_ n - 5n}{\sqrt{10n}}\right] \)

The term in brackets on the right-hand side is the sum of n i.i.d. random variables, centered by its mean 5n and divided by its standard deviation \(\sqrt{10n}\).

Therefore, by the CLT, this term converges in distribution to a standard normal distribution as n goes to infinity.

The overall expression therefore converges in distribution to a normal distribution with mean zero and variance 10a2.

So, taking a = \(\frac{1}{\sqrt{10}}\), the limiting distribution of \(a\left(\frac{X_1+\cdots+X_n-5 n}{\sqrt{n}}\right)\) is ​​the standard normal distribution for an appropriate value of a.

Hence, the correct option is 3.
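A quick simulation supports this: with a = 1/√10 the standardized sums should have sample mean near 0 and sample variance near 1. This is a sketch (n = 400 summands and 4000 replications are arbitrary choices), using the fact that a χ²₅ variable is Gamma with shape 5/2 and scale 2:

```python
import random
import statistics

random.seed(1)

def standardized_sum(n=400):
    """Draw X_1, ..., X_n ~ chi-square(5) (= Gamma(shape 5/2, scale 2)) and
    return (X_1 + ... + X_n - 5n) / sqrt(10n), i.e. take a = 1/sqrt(10)."""
    s = sum(random.gammavariate(2.5, 2.0) for _ in range(n))
    return (s - 5 * n) / (10 * n) ** 0.5

samples = [standardized_sum() for _ in range(4000)]
m, v = statistics.mean(samples), statistics.variance(samples)
print(m, v)  # both should be near the N(0, 1) values 0 and 1
```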

Random Variables & Distribution Functions Question 10:

For n ≥ p +1, let \(\underline{X_1}, \underline{X_2}, \ldots, \underline{X_n}\) be a random sample from \(N_p(\underline{\mu}, \Sigma), \underline{\mu} \in \mathbb{R}^p\) and Σ is a positive definite matrix. Define \(\underline{\bar{X}}=\frac{1}{n} \sum_{i=1}^n \underline{X_i}\) and \(A=\sum_{i=1}^n(\underline{X_i}-\underline{\bar{X}})(\underline{X_i}-\underline{\bar{X}})^T\). Then the distribution of Trace(AΣ-1) is 

  1. Wp(n - 1, Σ) 
  2. \(\chi_p^2\)
  3. \(\chi_{n p}^2\)
  4. \(\chi^2_{(n-1)p}\)

Answer (Detailed Solution Below)

Option 4 : \(\chi^2_{(n-1)p}\)

Random Variables & Distribution Functions Question 10 Detailed Solution

The correct answer is option 4.

Since \(\underline{X_1}, \ldots, \underline{X_n}\) is a random sample from \(N_p(\underline{\mu}, \Sigma)\), the matrix A has the Wishart distribution Wp(n - 1, Σ), so it can be written as \(A=\sum_{j=1}^{n-1}\underline{Z_j}\,\underline{Z_j}^T\) with \(\underline{Z_j}\) i.i.d. Np(0, Σ). Then Trace(AΣ-1) = \(\sum_{j=1}^{n-1}\underline{Z_j}^T\Sigma^{-1}\underline{Z_j}\), a sum of n - 1 independent \(\chi_p^2\) random variables, which has the \(\chi^2_{(n-1)p}\) distribution.

Random Variables & Distribution Functions Question 11:

Suppose that X is a random variable such that P(X ∈ {0, 1, 2}) = 1. If for some constant c, P(X = i) = cP(X = i - 1), i = 1, 2, then E[X] is

  1. \(\frac{1}{1+c+c^2}\)
  2. \(\frac{c+2 c^2}{1+c+c^2} \)
  3. \(\frac{c+c^2}{1+2 c} \)
  4. \(\frac{3 c}{1+2 c}\)

Answer (Detailed Solution Below)

Option 2 : \(\frac{c+2 c^2}{1+c+c^2} \)

Random Variables & Distribution Functions Question 11 Detailed Solution

The correct answer is option 2.

Let P(X = 0) = k. Then P(X = 1) = ck and P(X = 2) = c2k, and since the probabilities sum to 1, k = \(\frac{1}{1+c+c^2}\). Hence

E[X] = 0·k + 1·ck + 2·c2k = \(\frac{c+2 c^2}{1+c+c^2}\).
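The formula E[X] = (c + 2c²)/(1 + c + c²) can be spot-checked in exact arithmetic for a few values of c (a minimal sketch; the particular values of c are arbitrary):

```python
from fractions import Fraction

def mean_of_X(c):
    """E[X] with P(X=0) = k, P(X=1) = c*k, P(X=2) = c^2*k, k = 1/(1+c+c^2)."""
    k = 1 / (1 + c + c * c)           # normalizing constant
    return 1 * (c * k) + 2 * (c * c * k)

for c in (Fraction(1, 2), Fraction(2), Fraction(3, 4)):
    assert mean_of_X(c) == (c + 2 * c * c) / (1 + c + c * c)
print(mean_of_X(Fraction(1, 2)))  # -> 4/7
```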

Random Variables & Distribution Functions Question 12:

Which of the following is a valid cumulative distribution function?

  1. F(x) = \(\begin{cases}\rm \frac{1}{2+x^2} & \text { if } \rm x<0, \\\rm \frac{2+x^2}{3+x^2} & \text { if } \rm x \geq 0\end{cases}\)
  2. F(x) = \(\begin{cases}\rm\frac{1}{2+x^2} & \text { if } \rm x<0, \\ \rm\frac{2+x^2}{3+2 x^2} & \text { if } \rm x \geq 0\end{cases}\)
  3. F(x) = \(\begin{cases}\rm\frac{1}{2+x^2} & \text { if } \rm x<0 \text {, } \\ \rm\frac{2 \cos (x)+x^2}{4+x^2} & \text { if } \rm x \geq 0\end{cases}\)
  4. F(x) = \(\begin{cases}\rm\frac{1}{2+x^2} & \text { if } \rm x<0, \\ \rm\frac{1+x^2}{4+x^2} & \text { if } \rm x \geq 0\end{cases}\)

Answer (Detailed Solution Below)

Option 1 : F(x) = \(\begin{cases}\rm \frac{1}{2+x^2} & \text { if } \rm x<0, \\\rm \frac{2+x^2}{3+x^2} & \text { if } \rm x \geq 0\end{cases}\)

Random Variables & Distribution Functions Question 12 Detailed Solution

Concept:

Let F(x) be a cumulative distribution function then

(i) \(\lim_{x\to-\infty}F(x)\) = 0, \(\lim_{x\to\infty}F(x)\) = 1

(ii) F is a non-decreasing function

Explanation:

(2): F(x) = \(\begin{cases}\rm\frac{1}{2+x^2} & \text { if } \rm x<0, \\ \rm\frac{2+x^2}{3+2 x^2} & \text { if } \rm x \geq 0\end{cases}\)

\(\lim_{x\to\infty}F(x)\) = \(\lim_{x\to\infty}\frac{2+x^2}{3+2x^2}\) = \(\lim_{x\to\infty}\frac{2x}{4x}\) = \(\frac12\) (Using L'hospital rule), not satisfying

Option (2) is false

(3): F(x) = \(\begin{cases}\rm\frac{1}{2+x^2} & \text { if } \rm x<0 \text {, } \\ \rm\frac{2 \cos (x)+x^2}{4+x^2} & \text { if } \rm x \geq 0\end{cases}\)

\(\lim_{x\to\infty}F(x)\) = \(\lim_{x\to\infty}\frac{2\cos x+x^2}{4+x^2}\) = 1, since cos x is bounded while x2 → ∞, so property (i) does hold here.

However, F is not a non-decreasing function: F(0) = \(\frac{2}{4}\) = \(\frac12\), while \(F\left(\frac{\pi}{2}\right)=\frac{\pi^2/4}{4+\pi^2/4}\) ≈ 0.38 < \(\frac12\), so F decreases on part of \(\left(0, \frac{\pi}{2}\right)\), not satisfying property (ii).

Option (3) is false

(4): F(x) = \(\begin{cases}\rm\frac{1}{2+x^2} & \text { if } \rm x<0, \\ \rm\frac{1+x^2}{4+x^2} & \text { if } \rm x \geq 0\end{cases}\)

F(0-) = 1/2 and F(0+) = F(0) = 1/4, and \(\frac12>\frac14\), so F jumps down at 0 and does not satisfy the property "F is a non-decreasing function"

Option (4) is false

(1): \(\frac{1}{2+x^2}\) increases from 0 to \(\frac12\) on (-∞, 0), F(0) = \(\frac23 \ge \frac12\), and \(\frac{2+x^2}{3+x^2}=1-\frac{1}{3+x^2}\) increases to 1 on [0, ∞), so F is non-decreasing with limits 0 and 1.

Therefore option (1) is correct

Random Variables & Distribution Functions Question 13:

Let X be a Poisson random variable with mean λ. Which of the following parametric function is not estimable?

  1. λ−1
  2. λ
  3. λ2
  4. e−λ​

Answer (Detailed Solution Below)

Option 1 : λ−1

Random Variables & Distribution Functions Question 13 Detailed Solution

Concept:

A parametric function f(λ) is said to be estimable if there exists a statistic g(X) such that E(g(X)) = f(λ) for all λ; otherwise it is called not estimable.

Explanation:

Given X be a Poisson random variable with mean λ

So E(X) = λ and Var(X) = λ

We have to find the parametric function which is not estimable.

(2): E(X) = λ so here we are getting a function g(X) = X

So it is estimable

Option (2) is false

(3): E(X2) = [E(X)]2 + Var(X) = λ2 + λ

So E(X2 - X) = E(X2) - E(X) =  λ2 + λ - λ = λ2 

Here we are getting the function g(X) = X2 - X

 So it is estimable

Option (3) is false

(4): Take g(X) = 1 if X = 0 and g(X) = 0 otherwise. Then E(g(X)) = P(X = 0) = e−λ​

Here we are getting the function g(X) = \(\mathbb{1}\{X = 0\}\)

 So it is estimable 

Option (4) is false

(1): For any statistic g, E(g(X)) = \(e^{-\lambda}\sum_{x=0}^\infty \frac{g(x)\lambda^x}{x!}\) tends to g(0) as λ → 0+, so it stays bounded near 0, whereas λ−1 → ∞. No unbiased estimator of λ−1 exists, so it is not estimable.

Hence option (1) is correct

Random Variables & Distribution Functions Question 14:

Let X be a random variable with cumulative distribution function given by \(\rm F(x)=\left\{\begin{matrix}0,&\ if\ x<0\\\ \frac{x+1}{3},&\ if\ 0\le x < 1\\\ 1, & \ if\ x \ge 1\end{matrix}\right.\) Then the value of \(\rm P\left(\frac{1}{3}<X\le \frac{3}{4}\right)+P(X=0)\) is equal to

  1. \(\frac{7}{36}\)
  2. \(\frac{11}{36}\)
  3. \(\frac{13}{36}\)
  4. \(\frac{17}{36}\)

Answer (Detailed Solution Below)

Option 4 : \(\frac{17}{36}\)

Random Variables & Distribution Functions Question 14 Detailed Solution

Concepts Used:

1. Cumulative Distribution Function (CDF):

The CDF F(x) gives the probability that the random variable X takes a value less than or equal to x . That is, F(x) = P(X ≤ x) .

2. Finding Probability Using the CDF:

The probability that the random variable X lies within a certain interval (a, b] is given by:
     
 P(a < X ≤ b) = F(b) - F(a)

3. Probability at a Specific Point (Jump Discontinuity):

The probability at a specific point x = c is the difference in the CDF just to the right and just to the left of c :
     
P(X = c) = F(c+) - F(c-)

Explanation -

We are given a cumulative distribution function (CDF) F(x) of a random variable X as:

F(x) = \(\begin{cases} 0, & \text{if } x < 0 \\ \frac{x+1}{3}, & \text{if } 0 \leq x < 1 \\ 1, & \text{if } x \geq 1 \end{cases}\)

From the definition of the probability from the CDF:  P(a < X \(\le \) b) = F(b) - F(a)

In this case, we need to calculate \(F\left(\frac{3}{4}\right)\) and \(F\left(\frac{1}{3}\right)\).

Since \( 0 \le \frac{1}{3} < 1\)  and \(0 \le \frac{3}{4} < 1\) , we use the formula \(F(x) = \frac{x+1}{3}\) for both 1/3  and 3/4 :

\(F\left(\frac{1}{3}\right) = \frac{\frac{1}{3} + 1}{3} = \frac{\frac{4}{3}}{3} = \frac{4}{9}\)

\(\Rightarrow F\left(\frac{3}{4}\right) = \frac{\frac{3}{4} + 1}{3} = \frac{\frac{7}{4}}{3} = \frac{7}{12}\)

Thus, the probability \(P\left(\frac{1}{3} < X \leq \frac{3}{4}\right)\) is:

\(P\left(\frac{1}{3} < X \leq \frac{3}{4}\right) = F\left(\frac{3}{4}\right) - F\left(\frac{1}{3}\right) = \frac{7}{12} - \frac{4}{9}\)

= \(\frac{21}{36} - \frac{16}{36} = \frac{5}{36}\)

The probability at a point is the jump in the CDF at that point. We need to calculate F(0+) - F(0-) .

From the CDF definition: F(0+) = F(0) =\( \frac{0+1}{3} = \frac{1}{3}\)

⇒ F(0-) = 0

Thus, \(P(X = 0) = F(0^+) - F(0^-) = \frac{1}{3} - 0 = \frac{1}{3} = \frac{12}{36}\)

 

Now, we add the two results:

\(P\left(\frac{1}{3} < X \leq \frac{3}{4}\right) + P(X = 0) = \frac{5}{36} + \frac{12}{36} = \frac{17}{36}\)

Thus, the final answer is 17/36.

Random Variables & Distribution Functions Question 15:

Suppose a continuous random variable X follows a Uniform (0, 4) distribution. Now define \(Y = \sqrt{X}\). Which of the following statements are NOT accurate for X and Y?

  1. The range of Y is [0, 2]
  2. Both X and Y have the same mean
  3. The variance of X is greater than the variance of Y
  4. The distribution of Y is also uniform

Answer (Detailed Solution Below)

Option :

Random Variables & Distribution Functions Question 15 Detailed Solution

Explanation -

For a random variable X that follows a Uniform (0, 4) distribution, X will take any value between 0 and 4 with equal probability.

The defined \(Y = \sqrt{X}\) will range from 0 to 2.

Option (1) - The range of Y is [0, 2]

 This statement is TRUE. \(Y = \sqrt{X}\) will take a value between 0 (for X = 0) and 2 (for X = 4).

Option (2) - Both X and Y have the same mean  

This statement is NOT TRUE. The mean of X is (0 + 4)/2 = 2, by the uniform-distribution formula (a + b)/2.

Y, however, is not uniform, so its mean must be computed directly: E[Y] = E[\(\sqrt{X}\)] = \(\int_0^4 \frac{\sqrt{x}}{4}\,dx = \frac{4}{3}\). Since 2 ≠ 4/3, X and Y do not have the same mean.

Option (3) - The variance of X is greater than the variance of Y

This statement is TRUE. Var(X) = (4 - 0)2/12 = \(\frac{4}{3}\), while Var(Y) = E[Y2] - (E[Y])2 = E[X] - \(\left(\frac{4}{3}\right)^2\) = 2 - \(\frac{16}{9}\) = \(\frac{2}{9}\), so the variance of X is greater than the variance of Y.

Option (4) - The distribution of Y is also uniform

This statement is NOT TRUE. For 0 ≤ y ≤ 2, P(Y ≤ y) = P(X ≤ y2) = \(\frac{y^2}{4}\), which is not linear in y, so Y does not retain the uniform property of X.

So, statements B and D are NOT true.

Hence the correct options are 2 and 4.
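A short simulation illustrates the variance comparison and the non-uniformity of Y (a sketch; 200,000 draws is an arbitrary choice). Note P(Y ≤ 1) = P(X ≤ 1) = 1/4, whereas a Uniform(0, 2) variable would give 1/2:

```python
import random
import statistics

random.seed(2)

xs = [random.uniform(0.0, 4.0) for _ in range(200_000)]
ys = [x ** 0.5 for x in xs]                 # Y = sqrt(X), so Y lies in [0, 2]

var_x = statistics.variance(xs)             # theory: (4 - 0)^2 / 12 = 4/3
var_y = statistics.variance(ys)             # theory: 2 - 16/9 = 2/9
p_half = sum(y <= 1 for y in ys) / len(ys)  # P(Y <= 1) = P(X <= 1) = 1/4

print(var_x > var_y)     # True: X has the larger variance
print(round(p_half, 2))  # near 0.25; a Uniform(0, 2) variable would give 0.5
```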
