# IB DP Maths Topic 7.1 Probability generating functions for discrete random variables HL Paper 3

## Question

If $$X$$ is a random variable that follows a Poisson distribution with mean $$\lambda > 0$$ then the probability generating function of $$X$$ is $$G(t) = {e^{\lambda (t - 1)}}$$.

(i)     Prove that $${\text{E}}(X) = \lambda$$.

(ii)     Prove that $${\text{Var}}(X) = \lambda$$.

[6]
a.

$$Y$$ is a random variable, independent of $$X$$, that also follows a Poisson distribution with mean $$\lambda$$.

If $$S = 2X – Y$$ find

(i)     $${\text{E}}(S)$$;

(ii)     $${\text{Var}}(S)$$.

[3]
b.

Let $$T = \frac{X}{2} + \frac{Y}{2}$$.

(i)     Show that $$T$$ is an unbiased estimator for $$\lambda$$.

(ii)     Show that $$T$$ is a more efficient unbiased estimator of $$\lambda$$ than $$S$$.

[3]
c.

Could either $$S$$ or $$T$$ model a Poisson distribution? Justify your answer.

[1]
d.

By consideration of the probability generating function, $${G_{X + Y}}(t)$$, of $$X + Y$$, prove that $$X + Y$$ follows a Poisson distribution with mean $$2\lambda$$.

[3]
e.

Find

(i)     $${G_{X + Y}}(1)$$;

(ii)     $${G_{X + Y}}( – 1)$$.

[2]
f.

Hence find the probability that $$X + Y$$ is an even number.

[3]
g.

## Markscheme

(i)     $$G'(t) = \lambda {e^{\lambda (t - 1)}}$$     A1

$${\text{E}}(X) = G'(1)$$     M1

$$= \lambda$$     AG

(ii)     $$G''(t) = {\lambda ^2}{e^{\lambda (t - 1)}}$$     M1

$$\Rightarrow G''(1) = {\lambda ^2}$$     (A1)

$${\text{Var}}(X) = G''(1) + G'(1) - {\left( {G'(1)} \right)^2}$$     (M1)

$$= {\lambda ^2} + \lambda - {\lambda ^2}$$     A1

$$= \lambda$$     AG

[6 marks]

a.
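The markscheme identities $${\text{E}}(X) = G'(1)$$ and $${\text{Var}}(X) = G''(1) + G'(1) - {\left( {G'(1)} \right)^2}$$ can be sanity-checked numerically. A minimal sketch (assuming Python with only the standard library; the function names are illustrative), approximating the derivatives of the PGF by central differences:

```python
import math

def G(t, lam):
    # PGF of a Poisson random variable with mean lam
    return math.exp(lam * (t - 1))

def pgf_mean_var(lam, h=1e-4):
    # E(X) = G'(1) and Var(X) = G''(1) + G'(1) - G'(1)^2,
    # with the derivatives approximated by central differences
    d1 = (G(1 + h, lam) - G(1 - h, lam)) / (2 * h)
    d2 = (G(1 + h, lam) - 2 * G(1, lam) + G(1 - h, lam)) / h ** 2
    return d1, d2 + d1 - d1 ** 2

mean, var = pgf_mean_var(3.0)   # both should be close to lam = 3
```

For any positive $$\lambda$$ both returned values agree with $$\lambda$$ up to the discretisation error of the difference quotients.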

(i)     $${\text{E}}(S) = 2\lambda – \lambda = \lambda$$     A1

(ii)     $${\text{Var}}(S) = 4\lambda + \lambda = 5\lambda$$     (A1)A1

Note:     First A1 can be awarded for either $$4\lambda$$ or $$\lambda$$.

[3 marks]

b.

(i)     $${\text{E}}(T) = \frac{\lambda }{2} + \frac{\lambda }{2} = \lambda \;\;\;$$(so $$T$$ is an unbiased estimator)     A1

(ii)     $${\text{Var}}(T) = \frac{1}{4}\lambda + \frac{1}{4}\lambda = \frac{1}{2}\lambda$$     A1

this is less than $${\text{Var}}(S)$$, therefore $$T$$ is the more efficient estimator     R1AG

Note:     Follow through their variances from (b)(ii) and (c)(ii).

[3 marks]

c.
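The moment results for $$S = 2X - Y$$ and $$T = \frac{X}{2} + \frac{Y}{2}$$ (the combination the markscheme uses) can be illustrated by simulation. A sketch under the assumption of Python's standard library only, using Knuth's elementary Poisson sampler:

```python
import math
import random

def sample_poisson(lam, rng):
    # Knuth's product-of-uniforms sampler; adequate for small lam
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        p *= rng.random()
        k += 1
    return k - 1

def mean_var(data):
    m = sum(data) / len(data)
    return m, sum((d - m) ** 2 for d in data) / len(data)

rng = random.Random(1)
lam, n = 2.0, 200_000
xs = [sample_poisson(lam, rng) for _ in range(n)]
ys = [sample_poisson(lam, rng) for _ in range(n)]

ms, vs = mean_var([2 * x - y for x, y in zip(xs, ys)])    # S = 2X - Y
mt, vt = mean_var([(x + y) / 2 for x, y in zip(xs, ys)])  # T = X/2 + Y/2
# ms and mt should be near lam = 2; vs near 5*lam = 10; vt near lam/2 = 1
```

Both estimators are unbiased for $$\lambda$$, but the sample variance of $$T$$ comes out roughly ten times smaller than that of $$S$$, matching $$\frac{1}{2}\lambda$$ versus $$5\lambda$$.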

no, for each of $$S$$ and $$T$$ the mean does not equal the variance     R1

[1 mark]

d.

$${G_{X + Y}}(t) = {e^{\lambda (t - 1)}} \times {e^{\lambda (t - 1)}} = {e^{2\lambda (t - 1)}}$$     M1A1

which is the probability generating function for a Poisson with a mean of $$2\lambda$$     R1AG

[3 marks]

e.
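Multiplying the PGFs is equivalent to convolving the probability mass functions, so the result can be confirmed directly: the convolution of two Poisson($$\lambda$$) pmfs should reproduce the Poisson($$2\lambda$$) pmf term by term. A short check (assuming Python, standard library only):

```python
import math

def pois(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

lam = 1.7

def pmf_sum(n):
    # P(X + Y = n) by direct convolution of two Poisson(lam) pmfs
    return sum(pois(x, lam) * pois(n - x, lam) for x in range(n + 1))

mismatch = max(abs(pmf_sum(n) - pois(n, 2 * lam)) for n in range(20))
```

The mismatch is zero up to floating-point rounding, consistent with $${G_{X + Y}}(t) = {e^{2\lambda (t - 1)}}$$.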

(i)     $${G_{X + Y}}(1) = 1$$     A1

(ii)     $${G_{X + Y}}( - 1) = {e^{ - 4\lambda }}$$     A1

[2 marks]

f.

$${G_{X + Y}}(1) = p(0) + p(1) + p(2) + p(3) \ldots$$

$${G_{X + Y}}( - 1) = p(0) - p(1) + p(2) - p(3) \ldots$$

so $${\text{2P(even)}} = {G_{X + Y}}(1) + {G_{X + Y}}( - 1)$$     (M1)(A1)

$${\text{P(even)}} = \frac{1}{2}(1 + {e^{ - 4\lambda }})$$     A1

[3 marks]

Total [21 marks]

g.
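The parity trick above can be verified by brute force: sum the Poisson($$2\lambda$$) pmf over even values and compare with the closed form. A quick check (assuming Python, standard library only; the truncation at 100 terms is an assumption that is safe at this mean):

```python
import math

def pois(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

lam = 0.8
# X + Y ~ Poisson(2*lam); sum its pmf over the even values
# (the tail beyond 100 is negligible at this mean)
p_even = sum(pois(n, 2 * lam) for n in range(0, 100, 2))
closed_form = 0.5 * (1 + math.exp(-4 * lam))
```

The two quantities agree to machine precision, confirming $${\text{P(even)}} = \frac{1}{2}(1 + {e^{ - 4\lambda }})$$.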

## Examiners report

Solutions to the different parts of this question proved to be extremely variable in quality with some parts well answered by the majority of the candidates and other parts accessible to only a few candidates. Part (a) was well answered in general although the presentation was sometimes poor with some candidates doing the differentiation of $$G(t)$$ and the substitution of $$t = 1$$ simultaneously.

a.

Part (b) was well answered in general, the most common error being to state that $${\text{Var}}(2X – Y) = {\text{Var}}(2X) – {\text{Var}}(Y)$$.

b.

Parts (c) and (d) were well answered by the majority of candidates.

c.

Parts (c) and (d) were well answered by the majority of candidates.

d.

Solutions to (e), however, were extremely disappointing with few candidates giving correct solutions. A common incorrect solution was the following:

$$\;\;\;{G_{X + Y}}(t) = {G_X}(t){G_Y}(t)$$

Differentiating,

$$\;\;\;{G’_{X + Y}}(t) = {G’_X}(t){G_Y}(t) + {G_X}(t){G’_Y}(t)$$

$$\;\;\;{\text{E}}(X + Y) = {G’_{X + Y}}(1) = {\text{E}}(X) \times 1 + {\text{E}}(Y) \times 1 = 2\lambda$$

This is correct mathematics but it does not show that $$X + Y$$ is Poisson and it was given no credit. Even the majority of candidates who showed that $${G_{X + Y}}(t) = {{\text{e}}^{2\lambda (t – 1)}}$$ failed to state that this result proved that $$X + Y$$ is Poisson and they usually differentiated this function to show that $${\text{E}}(X + Y) = 2\lambda$$.

e.

In (f), most candidates stated that $${G_{X + Y}}(1) = 1$$ even if they were unable to determine $${G_{X + Y}}(t)$$ but many candidates were unable to evaluate $${G_{X + Y}}( – 1)$$. Very few correct solutions were seen to (g) even if the candidates correctly evaluated $${G_{X + Y}}(1)$$ and $${G_{X + Y}}( – 1)$$.

f.

[N/A]

g.

## Question

Determine the probability generating function for $$X \sim {\text{B}}(1,{\text{ }}p)$$.

[4]
a.

Explain why the probability generating function for $${\text{B}}(n,{\text{ }}p)$$ is a polynomial of degree $$n$$.

[2]
b.

Two independent random variables $${X_1}$$ and $${X_2}$$ are such that $${X_1} \sim {\text{B}}(1,{\text{ }}{p_1})$$ and $${X_2} \sim {\text{B}}(1,{\text{ }}{p_2})$$. Prove that if $${X_1} + {X_2}$$ has a binomial distribution then $${p_1} = {p_2}$$.

[5]
c.

## Markscheme

$${\text{P}}(X = 0) = 1 – p( = q);{\text{ P}}(X = 1) = p$$     (M1)(A1)

$${{\text{G}}_x}(t) = \sum\limits_r {{\text{P}}(X = r){t^r}\;\;\;}$$(or writing out term by term)     M1

$$= q + pt$$     A1

[4 marks]

a.

METHOD 1

$$PGF$$ for $$B(n,{\text{ }}p)$$ is $${(q + pt)^n}$$     R1

which is a polynomial of degree $$n$$     R1

METHOD 2

in $$n$$ independent trials, it is not possible to obtain more than $$n$$ successes (or equivalent, eg, $${\text{P}}(X > n) = 0$$)     R1

so $${a_r} = 0$$ for $$r > n$$     R1

[2 marks]

b.
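METHOD 1's claim, that $${(q + pt)^n}$$ is exactly the degree-$$n$$ polynomial whose $$t^r$$ coefficient is $${\text{P}}(X = r)$$, follows from the binomial theorem and can be confirmed numerically. A sketch (assuming Python 3.8+ for `math.comb`; the variable names are illustrative):

```python
from math import comb

n, p = 6, 0.3
q = 1 - p
# B(n, p) probabilities are the coefficients of t^r in (q + p t)^n
coeffs = [comb(n, r) * p ** r * q ** (n - r) for r in range(n + 1)]

def G(t):
    # PGF built as the probability-weighted power series
    return sum(c * t ** r for r, c in enumerate(coeffs))

poly_err = max(abs(G(t) - (q + p * t) ** n)
               for t in (-1.0, -0.3, 0.0, 0.5, 1.0))
```

The two expressions agree at every test point, and $$G(1) = 1$$ as every PGF must.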

let $$Y = {X_1} + {X_2}$$

$${G_Y}(t) = ({q_1} + {p_1}t)({q_2} + {p_2}t)$$     A1

$${G_Y}(t)$$ has degree two, so if $$Y$$ is binomial then

$$Y \sim {\text{B}}(2,{\text{ }}p)$$ for some $$p$$     R1

$${(q + pt)^2} = ({q_1} + {p_1}t)({q_2} + {p_2}t)$$     A1

Note:     The $$LHS$$ could be seen as $${q^2} + 2pqt + {p^2}{t^2}$$.

METHOD 1

by considering the roots of both sides, $$\frac{{{q_1}}}{{{p_1}}} = \frac{{{q_2}}}{{{p_2}}}$$     M1

$$\frac{{1 – {p_1}}}{{{p_1}}} = \frac{{1 – {p_2}}}{{{p_2}}}$$     A1

so $${p_1} = {p_2}$$     AG

METHOD 2

equating coefficients,

$${p_1}{p_2} = {p^2},{\text{ }}{q_1}{q_2} = {q^2}{\text{ or }}(1 – {p_1})(1 – {p_2}) = {(1 – p)^2}$$     M1

expanding,

$${p_1} + {p_2} = 2p$$ so $${p_1},{\text{ }}{p_2}$$ are the roots of $${x^2} – 2px + {p^2} = 0$$     A1

so $${p_1} = {p_2}$$     AG

[5 marks]

Total [11 marks]

c.
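METHOD 1's root argument has a convenient numerical form: if $${X_1} + {X_2}$$ were $${\text{B}}(2,{\text{ }}p)$$ then $$\sqrt {P(0)} + \sqrt {P(2)} = q + p = 1$$, and by AM–GM this sum is strictly less than 1 whenever $${p_1} \ne {p_2}$$. A small demonstration (assuming Python, standard library only; the helper names are illustrative):

```python
import math

def sum_pmf(p1, p2):
    # pmf of X1 + X2 read off from the product (q1 + p1 t)(q2 + p2 t)
    q1, q2 = 1 - p1, 1 - p2
    return q1 * q2, p1 * q2 + q1 * p2, p1 * p2

def root_sum(p1, p2):
    # if X1 + X2 were B(2, p) then P(0) = q^2 and P(2) = p^2,
    # so sqrt(P(0)) + sqrt(P(2)) would equal q + p = 1
    P0, _, P2 = sum_pmf(p1, p2)
    return math.sqrt(P0) + math.sqrt(P2)

equal_case = root_sum(0.4, 0.4)    # equals 1: consistent with binomial
unequal_case = root_sum(0.2, 0.6)  # strictly below 1: cannot be binomial
```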

## Examiners report

Solutions to (a) were often disappointing with some candidates simply writing down the answer. A common error was to forget the possibility of $$X$$ being zero so that $$G(t) = pt$$ was often seen.

a.

Explanations in (b) were often poor, again indicating a lack of ability to give a verbal explanation.

b.

Very few complete solutions to (c) were seen with few candidates even reaching the result that $$({q_1} + {p_1}t)({q_2} + {p_2}t)$$ must equal $${(q + pt)^2}$$ for some $$p$$.

c.

## Question

A discrete random variable $$U$$ follows a geometric distribution with $$p = \frac{1}{4}$$.

Find $$F(u)$$, the cumulative distribution function of $$U$$, for $$u = 1,{\text{ }}2,{\text{ }}3 \ldots$$

[3]
a.

Hence, or otherwise, find the value of $$P(U > 20)$$.

[2]
b.

Prove that the probability generating function of $$U$$ is given by $${G_u}(t) = \frac{t}{{4 – 3t}}$$.

[4]
c.

Given that $${U_i} \sim {\text{Geo}}\left( {\frac{1}{4}} \right),{\text{ }}i = 1,{\text{ }}2,{\text{ }}3$$, and that $$V = {U_1} + {U_2} + {U_3}$$, find

(i)     $${\text{E}}(V)$$;

(ii)     $${\text{Var}}(V)$$;

(iii)     $${G_v}(t)$$, the probability generating function of $$V$$.

[6]
d.

A third random variable $$W$$, has probability generating function $${G_w}(t) = \frac{1}{{{{(4 – 3t)}^3}}}$$.

By differentiating $${G_w}(t)$$, find $${\text{E}}(W)$$.

[4]
e.

A third random variable $$W$$, has probability generating function $${G_w}(t) = \frac{1}{{{{(4 – 3t)}^3}}}$$.

Prove that $$V = W + 3$$.

[3]
f.

## Markscheme

METHOD 1

$${\text{P}}(U = u) = \frac{1}{4}{\left( {\frac{3}{4}} \right)^{u – 1}}$$     (M1)

$$F(u) = {\text{P}}(U \le u) = \sum\limits_{r = 1}^u {\frac{1}{4}{{\left( {\frac{3}{4}} \right)}^{r – 1}}\;\;\;}$$(or equivalent)

$$= \frac{{\frac{1}{4}\left( {1 – {{\left( {\frac{3}{4}} \right)}^u}} \right)}}{{1 – \frac{3}{4}}}$$     (M1)

$$= 1 – {\left( {\frac{3}{4}} \right)^u}$$     A1

METHOD 2

$${\text{P}}(U \le u) = 1 – {\text{P}}(U > u)$$     (M1)

$${\text{P}}(U > u) =$$ probability of $$u$$ consecutive failures     (M1)

$${\text{P}}(U \le u) = 1 – {\left( {\frac{3}{4}} \right)^u}$$     A1

[3 marks]

a.
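Both methods give $$F(u) = 1 - {\left( {\frac{3}{4}} \right)^u}$$, which can be checked against the partial sums of the geometric pmf, and part (b) falls out of the same code. A sketch (assuming Python, standard library only):

```python
p = 0.25

def pmf(u):
    # geometric: first success on trial u
    return p * (1 - p) ** (u - 1)

def F(u):
    # the claimed closed form of the cdf
    return 1 - (1 - p) ** u

cdf_err = max(abs(sum(pmf(r) for r in range(1, u + 1)) - F(u))
              for u in range(1, 30))
p_gt_20 = 1 - F(20)   # = (3/4)^20, approximately 0.00317
```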

$${\text{P}}(U > 20) = 1 – {\text{P}}(U \le 20)$$     (M1)

$$= {\left( {\frac{3}{4}} \right)^{20}}\;\;\;( = 0.00317)$$     A1

[2 marks]

b.

$${G_U}(t) = \sum\limits_{r = 1}^\infty {\frac{1}{4}{{\left( {\frac{3}{4}} \right)}^{r – 1}}{t^r}\;\;\;}$$(or equivalent)     M1A1

$$= \sum\limits_{r = 1}^\infty {\frac{1}{3}{{\left( {\frac{3}{4}t} \right)}^r}}$$     (M1)

$$= \frac{{\frac{1}{3}\left( {\frac{3}{4}t} \right)}}{{1 – \frac{3}{4}t}}\;\;\;\left( { = \frac{{\frac{1}{4}t}}{{1 – \frac{3}{4}t}}} \right)$$     A1

$$= \frac{t}{{4 – 3t}}$$     AG

[4 marks]

c.
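The geometric-series summation above can be checked by comparing a long partial sum of $$\sum {\frac{1}{4}{{\left( {\frac{3}{4}} \right)}^{r - 1}}{t^r}}$$ against $$\frac{t}{{4 - 3t}}$$ at several points inside the radius of convergence $$\left| t \right| < \frac{4}{3}$$. A sketch (assuming Python, standard library only; the 400-term truncation is an assumption that is ample here):

```python
def G_closed(t):
    return t / (4 - 3 * t)

def G_series(t, terms=400):
    # sum over r >= 1 of (1/4)(3/4)^(r-1) t^r, valid for |t| < 4/3
    return sum(0.25 * 0.75 ** (r - 1) * t ** r for r in range(1, terms + 1))

series_err = max(abs(G_series(t) - G_closed(t))
                 for t in (-1.0, -0.5, 0.0, 0.5, 0.9))
```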

(i)     $$E(U) = \frac{1}{{\frac{1}{4}}} = 4$$     (A1)

$$E({U_1} + {U_2} + {U_3}{\text{)}} = 4 + 4 + 4 = 12$$     A1

(ii)     $${\text{Var}}(U) = \frac{{\frac{3}{4}}}{{{{\left( {\frac{1}{4}} \right)}^2}}}=12$$     A1

$${\text{Var(}}{U_1} + {U_2} + {U_3}) = 12 + 12 + 12 = 36$$     A1

(iii)     $${G_v}(t) = {\left( {{G_U}(t)} \right)^3}$$     (M1)

$$= {\left( {\frac{t}{{4 – 3t}}} \right)^3}$$     A1

[6 marks]

d.
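Parts (i)–(iii) fit together: differentiating the cubed PGF at $$t = 1$$ should reproduce $${\text{E}}(V) = 12$$ and $${\text{Var}}(V) = 36$$. A numerical check via finite differences (a sketch assuming Python, standard library only):

```python
def G_V(t):
    # pgf of V = U1 + U2 + U3, each Geo(1/4)
    return (t / (4 - 3 * t)) ** 3

h = 1e-5
d1 = (G_V(1 + h) - G_V(1 - h)) / (2 * h)             # approximates G_V'(1)
d2 = (G_V(1 + h) - 2 * G_V(1) + G_V(1 - h)) / h ** 2  # approximates G_V''(1)
mean_V = d1                 # should be close to 12
var_V = d2 + d1 - d1 ** 2   # should be close to 36
```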

$${G_W}^\prime (t) = – 3{(4 – 3t)^{ – 4}}( – 3)\;\;\;\left( { = \frac{9}{{{{(4 – 3t)}^4}}}} \right)$$     (M1)(A1)

$$E(W) = {G_W}^\prime (1) = 9$$     (M1)A1

Note:     Allow the use of the calculator to perform the differentiation.

[4 marks]

e.
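Since the markscheme explicitly allows a calculator for the differentiation, a numerical derivative is in the spirit of the question. A central-difference check of $${\text{E}}(W) = {G_W}^\prime (1) = 9$$ (a sketch assuming Python, standard library only):

```python
def G_W(t):
    return 1 / (4 - 3 * t) ** 3

h = 1e-6
mean_W = (G_W(1 + h) - G_W(1 - h)) / (2 * h)   # approximates G_W'(1) = 9
```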

EITHER

probability generating function of the constant 3 is $${t^3}$$     A1

OR

$${G_{W + 3}}(t) = E({t^{W + 3}}) = E({t^W})E({t^3})$$     A1

THEN

$$W + 3$$ has generating function $${G_{W + 3}}(t) = \frac{1}{{{{(4 - 3t)}^3}}} \times {t^3} = {G_V}(t)$$     M1

as the generating functions are the same $$V = W + 3$$     R1AG

[3 marks]

Total [22 marks]

f.

[N/A]

a.

[N/A]

b.

[N/A]

c.

[N/A]

d.

[N/A]

e.

[N/A]

f.

## Question

The continuous random variable $$X$$ has probability density function

$f(x) = \left\{ {\begin{array}{*{20}{c}} {{{\text{e}}^{ – x}}}&{x \geqslant 0} \\ 0&{x < 0} \end{array}.} \right.$

The discrete random variable $$Y$$ is defined as the integer part of $$X$$, that is the largest integer less than or equal to $$X$$.

Show that the probability distribution of $$Y$$ is given by $${\text{P}}(Y = y) = {{\text{e}}^{ – y}}(1 – {{\text{e}}^{ – 1}}),{\text{ }}y \in \mathbb{N}$$.

[4]
a.

(i)     Show that $$G(t)$$, the probability generating function of $$Y$$, is given by $$G(t) = \frac{{1 – {{\text{e}}^{ – 1}}}}{{1 – {{\text{e}}^{ – 1}}t}}$$.

(ii)     Hence determine the value of $${\text{E}}(Y)$$ correct to three significant figures.

[8]
b.

## Markscheme

$${\text{P}}(Y = y) = \int_y^{y + 1} {{{\text{e}}^{ – x}}{\text{d}}x}$$    M1A1

$$= \left[ { - {{\text{e}}^{ - x}}} \right]_y^{y + 1}$$    A1

$$= – {{\text{e}}^{ – (y + 1)}} + {{\text{e}}^{ – y}}$$    A1

$$= {{\text{e}}^{ – y}}(1 – {{\text{e}}^{ – 1}})$$    AG

[4 marks]

a.

(i)     attempt to use $$G(t) = \sum {{\text{P}}(Y = y){t^y}}$$     (M1)

$$= \sum\limits_{y = 0}^\infty {{{\text{e}}^{ – y}}(1 – {{\text{e}}^{ – 1}}){t^y}}$$    A1

Note:     Accept a listing of terms without the use of $$\Sigma$$.

this is an infinite geometric series with first term $$1 – {{\text{e}}^{ – 1}}$$ and common ratio $${{\text{e}}^{ – 1}}t$$     M1

$$G(t) = \frac{{1 – {{\text{e}}^{ – 1}}}}{{1 – {{\text{e}}^{ – 1}}t}}$$    AG

(ii)     $${\text{E}}(Y) = G'(1)$$     M1

$$G'(t) = \frac{{1 – {{\text{e}}^{ – 1}}}}{{{{(1 – {{\text{e}}^{ – 1}}t)}^2}}} \times {{\text{e}}^{ – 1}}$$     (M1)(A1)

$${\text{E}}(Y) = \frac{{{{\text{e}}^{ – 1}}}}{{(1 – {{\text{e}}^{ – 1}})}}$$    (A1)

$$= 0.582$$    A1

Note:     Allow the use of GDC to determine $$G'(1)$$.

[8 marks]

b.
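The distribution of $$Y$$ and the value of $${\text{E}}(Y)$$ can both be confirmed numerically from the pmf $${\text{P}}(Y = y) = {{\text{e}}^{ - y}}(1 - {{\text{e}}^{ - 1}})$$. A sketch (assuming Python, standard library only; truncating at 200 terms, where the tail is negligible):

```python
import math

c = 1 - math.exp(-1)
# pmf of Y = floor(X) for X ~ Exp(1)
pmf = [math.exp(-y) * c for y in range(200)]

total = sum(pmf)                               # should be 1
mean_Y = sum(y * q for y, q in enumerate(pmf))
closed = math.exp(-1) / (1 - math.exp(-1))     # = 1/(e - 1), about 0.582
```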

## Examiners report

In (a), it was disappointing to find that very few candidates realised that $${\text{P}}(Y = y)$$ could be found by integrating $$f(x)$$ from $$y$$ to $$y + 1$$. Candidates who simply integrated $$f(x)$$ to find the cumulative distribution function of $$X$$ were given no credit unless they attempted to use their result to find the probability distribution of $$Y$$.

a.

Solutions to (b)(i) were generally good although marks were lost due to not including the $$y = 0$$ term.

Part (b)(ii) was also well answered in general with the majority of candidates using the GDC to evaluate $$G'(1)$$.

Candidates who tried to differentiate $$G(t)$$ algebraically often made errors.

b.

## Question

Two independent discrete random variables $$X$$ and $$Y$$ have probability generating functions $$G(t)$$ and $$H(t)$$ respectively. Let $$Z = X + Y$$ have probability generating function $$J(t)$$.

Write down an expression for $$J(t)$$ in terms of $$G(t)$$ and $$H(t)$$.

[1]
a.

By differentiating $$J(t)$$, prove that

(i)     $${\text{E}}(Z) = {\text{E}}(X) + {\text{E}}(Y)$$;

(ii)     $${\text{Var}}(Z) = {\text{Var}}(X) + {\text{Var}}(Y)$$.

[10]
b.

## Markscheme

$$J(t) = G(t)H(t)$$    A1

[1 mark]

a.

(i)     $$J'(t) = G'(t)H(t) + G(t)H'(t)$$     M1A1

$$J'(1) = G'(1)H(1) + G(1)H'(1)$$    M1

$$J'(1) = G'(1) + H'(1)$$    A1

so $$E(Z) = E(X) + E(Y)$$     AG

(ii)     $$J''(t) = G''(t)H(t) + G'(t)H'(t) + G'(t)H'(t) + G(t)H''(t)$$     M1A1

$$J''(1) = G''(1)H(1) + 2G'(1)H'(1) + G(1)H''(1)$$

$$= G''(1) + 2G'(1)H'(1) + H''(1)$$    A1

$${\text{Var}}(Z) = J''(1) + J'(1) - {\left( {J'(1)} \right)^2}$$    M1

$$= G''(1) + 2G'(1)H'(1) + H''(1) + G'(1) + H'(1) - {\left( {G'(1) + H'(1)} \right)^2}$$    A1

$$= G''(1) + G'(1) - {\left( {G'(1)} \right)^2} + H''(1) + H'(1) - {\left( {H'(1)} \right)^2}$$    A1

so $${\text{Var}}(Z) = {\text{Var}}(X) + {\text{Var}}(Y)$$     AG

Note: If addition is wrongly used instead of multiplication in (a) it is inappropriate to give FT apart from the second M marks in each part, as the working is too simple.

[10 marks]

b.
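Because $$J(t) = G(t)H(t)$$ corresponds to convolving the two pmfs, the additivity of mean and variance can be illustrated on any pair of discrete distributions. A sketch (assuming Python, standard library only; the two pmfs are arbitrary examples, not from the question):

```python
def moments(pmf):
    # mean and variance of a pmf given as [P(0), P(1), ...]
    m = sum(k * p for k, p in enumerate(pmf))
    ex2 = sum(k * k * p for k, p in enumerate(pmf))
    return m, ex2 - m ** 2

def convolve(f, g):
    # pmf of X + Y; coefficient-wise product of the generating functions
    out = [0.0] * (len(f) + len(g) - 1)
    for i, a in enumerate(f):
        for j, b in enumerate(g):
            out[i + j] += a * b
    return out

f = [0.1, 0.4, 0.5]            # arbitrary pmf for X
g = [0.25, 0.25, 0.25, 0.25]   # arbitrary pmf for Y
mf, vf = moments(f)
mg, vg = moments(g)
mz, vz = moments(convolve(f, g))
# expect mz = mf + mg and vz = vf + vg
```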

[N/A]

a.

[N/A]

b.

## Question

A continuous random variable $$T$$ has a probability density function defined by

$$f(t) = \left\{ {\begin{array}{*{20}{c}} {\frac{{t(4 – {t^2})}}{4}}&{0 \leqslant t \leqslant 2} \\ {0,}&{{\text{otherwise}}} \end{array}} \right.$$.

Find the cumulative distribution function $$F(t)$$, for $$0 \leqslant t \leqslant 2$$.

[3]
a.

Sketch the graph of $$F(t)$$ for $$0 \leqslant t \leqslant 2$$, clearly indicating the coordinates of the endpoints.

[2]
b.i.

Given that $$P(T < a) = 0.75$$, find the value of $$a$$.

[2]
b.ii.

## Markscheme

$$F(t) = \int_0^t {\left( {x – \frac{{{x^3}}}{4}} \right){\text{d}}x{\text{ }}\left( { = \int_0^t {\frac{{x(4 – {x^2})}}{4}{\text{d}}x} } \right)}$$     M1

$$= \left[ {\frac{{{x^2}}}{2} - \frac{{{x^4}}}{{16}}} \right]_0^t{\text{ }}\left( { = \left[ {\frac{{{x^2}(8 - {x^2})}}{{16}}} \right]_0^t} \right){\text{ }}\left( { = \left[ {1 - \frac{{{{(4 - {x^2})}^2}}}{{16}}} \right]_0^t} \right)$$     A1

$$= \frac{{{t^2}}}{2} – \frac{{{t^4}}}{{16}}{\text{ }}\left( { = \frac{{{t^2}(8 – {t^2})}}{{16}}} \right){\text{ }}\left( { = 1 – \frac{{{{(4 – {t^2})}^2}}}{{16}}} \right)$$     A1

Note:     Condone integration involving $$t$$ only.

Note:     Award M1A0A0 for integration without limits eg, $$\int {\frac{{t(4 – {t^2})}}{4}{\text{d}}t = \frac{{{t^2}}}{2} – \frac{{{t^4}}}{{16}}}$$ or equivalent.

Note:     But allow integration $$+$$ $$C$$ then showing $$C = 0$$ or even integration without $$C$$ if $$F(0) = 0$$ or $$F(2) = 1$$ is confirmed.

[3 marks]

a.

correct shape including correct concavity     A1

clearly indicating starts at origin and ends at $$(2,{\text{ }}1)$$     A1

Note:     Condone the absence of $$(0,{\text{ }}0)$$.

Note:     Accept 2 on the $$x$$-axis and 1 on the $$y$$-axis correctly placed.

[2 marks]

b.i.

attempt to solve $$\frac{{{a^2}}}{2} – \frac{{{a^4}}}{{16}} = 0.75$$ (or equivalent) for $$a$$     (M1)

$$a = 1.41{\text{ }}( = \sqrt 2 )$$     A1

Note:     Accept any answer that rounds to 1.4.

[2 marks]

b.ii.
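The equation $$\frac{{{a^2}}}{2} - \frac{{{a^4}}}{{16}} = 0.75$$ is a quadratic in $${a^2}$$: multiplying by 16 gives $${a^4} - 8{a^2} + 12 = 0$$, so $${a^2} = 2$$ or $$6$$, and only $${a^2} = 2$$ lies in $$[0,{\text{ }}2]$$. A quick substitution check (assuming Python, standard library only):

```python
import math

def F(t):
    # cumulative distribution function from part (a)
    return t ** 2 / 2 - t ** 4 / 16

a = math.sqrt(2)
# F(sqrt(2)) = 2/2 - 4/16 = 0.75
residual = F(a) - 0.75
```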

[N/A]

a.

[N/A]

b.i.

[N/A]

b.ii.

## Question

The random variable $$X$$ follows a Poisson distribution with mean $$\lambda$$. The probability generating function of $$X$$ is given by $${G_X}(t) = {{\text{e}}^{\lambda (t - 1)}}$$.

The random variable $$Y$$, independent of $$X$$, follows a Poisson distribution with mean $$\mu$$.

Find expressions for $${G'_X}(t)$$ and $${G''_X}(t)$$.

[2]
a.i.

Hence show that $${\text{Var}}(X) = \lambda$$.

[3]
a.ii.

By considering the probability generating function, $${G_{X + Y}}(t)$$, of $$X + Y$$, show that $$X + Y$$ follows a Poisson distribution with mean $$\lambda + \mu$$.

[3]
b.

Show that $${\text{P}}(X = x|X + Y = n) = \left( {\begin{array}{*{20}{c}} n \\ x \end{array}} \right){\left( {\frac{\lambda }{{\lambda + \mu }}} \right)^x}{\left( {1 – \frac{\lambda }{{\lambda + \mu }}} \right)^{n – x}}$$, where $$n$$, $$x$$ are non-negative integers and $$n \geqslant x$$.

[5]
c.i.

Identify the probability distribution given in part (c)(i) and state its parameters.

[2]
c.ii.

## Markscheme

$${G'_X}(t) = \lambda {{\text{e}}^{\lambda (t - 1)}}$$     A1

$${G''_X}(t) = {\lambda ^2}{{\text{e}}^{\lambda (t - 1)}}$$     A1

[2 marks]

a.i.

$${\text{Var}}(X) = {G''_X}(1) + {G'_X}(1) - {\left( {{G'_X}(1)} \right)^2}$$     (M1)

$${G'_X}(1) = \lambda$$ and $${G''_X}(1) = {\lambda ^2}$$     (A1)

$${\text{Var}}(X) = {\lambda ^2} + \lambda - {\lambda ^2}$$     A1

$$= \lambda$$     AG

[3 marks]

a.ii.

$${G_{X + Y}}(t) = {{\text{e}}^{\lambda (t - 1)}} \times {{\text{e}}^{\mu (t - 1)}}$$     M1

Note:     The M1 is for knowing to multiply pgfs.

$$= {{\text{e}}^{(\lambda + \mu )(t - 1)}}$$     A1

which is the pgf for a Poisson distribution with mean $$\lambda + \mu$$     R1AG

Note:     Line 3 identifying the Poisson pgf must be seen.

[3 marks]

b.

$${\text{P}}(X = x|X + Y = n) = \frac{{{\text{P}}(X = x \cap Y = n – x)}}{{{\text{P}}(X + Y = n)}}$$     (M1)

$$= \left( {\frac{{{{\text{e}}^{ – \lambda }}{\lambda ^x}}}{{x!}}} \right)\left( {\frac{{{{\text{e}}^{ – \mu }}{\mu ^{n – x}}}}{{(n – x)!}}} \right)\left( {\frac{{n!}}{{{{\text{e}}^{ – (\lambda + \mu )}}{{(\lambda + \mu )}^n}}}} \right)$$ (or equivalent)     M1A1

$$= \left( {\begin{array}{*{20}{c}} n \\ x \end{array}} \right)\frac{{{\lambda ^x}{\mu ^{n – x}}}}{{{{(\lambda + \mu )}^n}}}$$     A1

$$= \left( {\begin{array}{*{20}{c}} n \\ x \end{array}} \right){\left( {\frac{\lambda }{{\lambda + \mu }}} \right)^x}{\left( {\frac{\mu }{{\lambda + \mu }}} \right)^{n – x}}$$     A1

leading to $${\text{P}}(X = x|X + Y = n) = \left( {\begin{array}{*{20}{c}} n \\ x \end{array}} \right){\left( {\frac{\lambda }{{\lambda + \mu }}} \right)^x}{\left( {1 – \frac{\lambda }{{\lambda + \mu }}} \right)^{n – x}}$$     AG

[5 marks]

c.i.

$${\text{B}}\left( {n,{\text{ }}\frac{\lambda }{{\lambda + \mu }}} \right)$$     A1A1

Note:     Award A1 for stating binomial and A1 for stating correct parameters.

[2 marks]

c.ii.

[N/A]

a.i.

[N/A]

a.ii.

[N/A]

b.

[N/A]

c.i.

[N/A]

c.ii.

## Question

Consider an unbiased tetrahedral (four-sided) die with faces labelled 1, 2, 3 and 4 respectively.

The random variable X represents the number of throws required to obtain a 1.

State the distribution of X.

[1]
a.

Show that the probability generating function, $$G\left( t \right)$$, for X is given by $$G\left( t \right) = \frac{t}{{4 – 3t}}$$.

[4]
b.

Find $$G’\left( t \right)$$.

[2]
c.

Determine the mean number of throws required to obtain a 1.

[1]
d.

## Markscheme

X is geometric (or negative binomial)      A1

[1 mark]

a.

$$G\left( t \right) = \frac{1}{4}t + \frac{1}{4}\left( {\frac{3}{4}} \right){t^2} + \frac{1}{4}{\left( {\frac{3}{4}} \right)^2}{t^3} + \ldots$$     M1A1

recognition of GP $$\left( {{u_1} = \frac{1}{4}t,\,\,r = \frac{3}{4}t} \right)$$     (M1)

$$= \frac{{\frac{1}{4}t}}{{1 – \frac{3}{4}t}}$$     A1

leading to $$G\left( t \right) = \frac{t}{{4 – 3t}}$$     AG

[4 marks]

b.

attempt to use product or quotient rule      M1

$$G’\left( t \right) = \frac{4}{{{{\left( {4 – 3t} \right)}^2}}}$$     A1

[2 marks]

c.

4      A1

Note: Award A1FT to a candidate that correctly calculates the value of $$G’\left( 1 \right)$$ from their $$G’\left( t \right)$$.

[1 mark]

d.

[N/A]

a.

[N/A]

b.

[N/A]

c.

[N/A]

d.

## Question

When Andrew throws a dart at a target, the probability that he hits it is $$\frac{1}{3}$$; when Bill throws a dart at the target, the probability that he hits it is $$\frac{1}{4}$$. Successive throws are independent. One evening, they throw darts at the target alternately, starting with Andrew, and stopping as soon as one of their darts hits the target. Let X denote the total number of darts thrown.

Write down the value of $${\text{P}}(X = 1)$$ and show that $${\text{P}}(X = 2) = \frac{1}{6}$$.

[2]
a.

Show that the probability generating function for X is given by

$G(t) = \frac{{2t + {t^2}}}{{6 – 3{t^2}}}.$

[6]
b.

Hence determine $${\text{E}}(X)$$.

[4]
c.

## Markscheme

$${\text{P}}(X = 1) = \frac{1}{3}$$     A1

$${\text{P}}(X = 2) = \frac{2}{3} \times \frac{1}{4}$$     A1

$$= \frac{1}{6}$$     AG

[2 marks]

a.

$$G(t) = \frac{1}{3}t + \frac{2}{3} \times \frac{1}{4}{t^2} + \frac{2}{3} \times \frac{3}{4} \times \frac{1}{3}{t^3} + \frac{2}{3} \times \frac{3}{4} \times \frac{2}{3} \times \frac{1}{4}{t^4} + \ldots$$     M1A1

$$= \frac{1}{3}t\left( {1 + \frac{1}{2}{t^2} + \ldots } \right) + \frac{1}{6}{t^2}\left( {1 + \frac{1}{2}{t^2} + \ldots } \right)$$     M1A1

$$= \frac{{\frac{t}{3}}}{{1 – \frac{{{t^2}}}{2}}} + \frac{{\frac{{{t^2}}}{6}}}{{1 – \frac{{{t^2}}}{2}}}$$     A1A1

$$= \frac{{2t + {t^2}}}{{6 – 3{t^2}}}$$     AG

[6 marks]

b.

$$G'(t) = \frac{{(2 + 2t)(6 – 3{t^2}) + 6t(2t + {t^2})}}{{{{(6 – 3{t^2})}^2}}}$$     M1A1

$${\text{E}}(X) = G'(1) = \frac{{10}}{3}$$     M1A1

[4 marks]

c.
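The value $${\text{E}}(X) = \frac{{10}}{3}$$ can be confirmed without the PGF by summing $$x\,{\text{P}}(X = x)$$ directly. Each "round" of two misses (Andrew then Bill) has probability $$\frac{2}{3} \times \frac{3}{4} = \frac{1}{2}$$, which gives a simple pmf. A sketch (assuming Python, standard library only; truncating at 200 throws, where the tail is negligible):

```python
def pmf(x):
    # a round of two misses (Andrew then Bill) has probability 1/2
    if x % 2 == 1:                 # Andrew hits on throw x = 2k - 1
        k = (x + 1) // 2
        return (1 / 2) ** (k - 1) * (1 / 3)
    k = x // 2                     # Bill hits on throw x = 2k
    return (1 / 2) ** (k - 1) * (2 / 3) * (1 / 4)

total = sum(pmf(x) for x in range(1, 200))      # should be 1
mean = sum(x * pmf(x) for x in range(1, 200))   # should be close to 10/3
```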

[N/A]

a.

[N/A]

b.

[N/A]

c.
