The diagram below shows an analogue-to-digital converter that samples an analogue signal, m(t), and generates a serial digital bit stream.

a) What is the minimum sample rate the sampler can use to ensure the analogue signal can be recreated without distortion? [2]

b) If the system operates at the minimum sample rate, draw the amplitude spectrum of the signal at the output of the sampler. [4]

c) The quantiser converts analogue samples to digital samples. Why is there an optimum number of bits per sample? [6]

d) If the encoded digital bit stream, d(t), is to be transmitted over a CAT-5 cable, draw an example line code that could be used, explaining how 1 and 0 bits are encoded. [4]

e) When operating at the minimum sample rate, if the quantiser uses 10 bits per sample, what is the bit rate of the encoded data signal (no data compression used)? [1]
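The arithmetic behind parts (a) and (e) can be sketched as follows. The message bandwidth B is given only in the diagram (not in the text above), so the value B = 4 kHz below is purely an assumed placeholder; the relationships fs_min = 2B (the Nyquist rate) and bit rate = fs_min x bits per sample are the general ones.

```python
# Sketch of the sampling-rate and bit-rate arithmetic for parts (a) and (e).
# B is NOT given in the question text; 4 kHz is an assumed placeholder value.

B = 4_000                  # Hz, assumed message bandwidth (hypothetical)
fs_min = 2 * B             # part (a): Nyquist rate, minimum sampling rate
                           # for distortion-free reconstruction
bits_per_sample = 10       # given in part (e)
bit_rate = fs_min * bits_per_sample  # part (e): no compression, so
                                     # bit rate = samples/s * bits/sample

print(fs_min)    # 8000 samples per second
print(bit_rate)  # 80000 bits per second
```

With any other bandwidth from the diagram, the same two lines give the answer directly.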
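For part (d), one commonly used line code is Manchester encoding, shown here as a minimal sketch. The convention assumed is the IEEE 802.3 one (a 0 bit is a high-to-low transition at mid-bit, a 1 bit is low-to-high); the question does not name a specific code, so this is only one valid answer. The guaranteed mid-bit transition keeps the signal DC-balanced and self-clocking, which suits transmission over a CAT-5 cable.

```python
# Manchester encoding sketch (one candidate line code for part (d)).
# IEEE 802.3 convention assumed: 0 -> high-to-low, 1 -> low-to-high.
# Each bit maps to two half-bit voltage levels, here normalised to +1/-1.

def manchester(bits):
    table = {0: (+1, -1), 1: (-1, +1)}  # (first half, second half) of each bit
    out = []
    for b in bits:
        out.extend(table[b])
    return out

print(manchester([1, 0, 1]))  # [-1, 1, 1, -1, -1, 1]
```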

Fig. 1