Math Assignment Help | CS/ECE-4501/6501
AUTHOR
essaygo
PUBLISHED ON:
September 21, 2022
This is a function-related math assignment from the United States.

 

Problem Set I

  1. (20%) Prove the properties of convolution. For all continuous functions f, g, and h, the following axioms hold. Please make sure to lay out all key steps of your proof for full credit.
  • Associativity: (f ∗ g) ∗ h = f ∗ (g ∗ h)
  • Distributivity: f ∗ (g + h) = f ∗ g + f ∗ h
  • Differentiation rule: (f ∗ g)′ = f′ ∗ g = f ∗ g′
  • Convolution theorem: F(g ∗ h) = F(g)F(h), where F denotes the Fourier transform
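Before writing the proof, it can help to sanity-check the convolution theorem numerically. The sketch below uses the discrete (circular) convolution, which the DFT matches exactly; the signals and their length are arbitrary choices for illustration, not part of the assignment.

```python
import numpy as np

# Numerical sanity check of the convolution theorem F(g * h) = F(g)F(h),
# using circular convolution, for which the DFT version holds exactly.
rng = np.random.default_rng(0)
N = 64
g = rng.standard_normal(N)
h = rng.standard_normal(N)

# circular convolution: (g * h)[n] = sum_m g[m] h[(n - m) mod N]
conv = np.array([sum(g[m] * h[(n - m) % N] for m in range(N))
                 for n in range(N)])

# the DFT turns convolution into pointwise multiplication
lhs = np.fft.fft(conv)
rhs = np.fft.fft(g) * np.fft.fft(h)
print(np.allclose(lhs, rhs))  # True
```

The continuous-case proof asked for above follows the same structure, with the sum replaced by an integral and a change of variables.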
  2. (25%) Frequency smoothing:

(a) Compute the Fourier transform of the given image lenaNoise.PNG using the fft2 function in Matlab (or numpy.fft.fft2 in Python), and then center the low frequencies (e.g., by using fftshift).

(b) Keep different numbers of low frequencies (e.g., 10², 20², 40², and the full dimension), and set all other high frequencies to 0.

(c) Reconstruct the original image (ifft2) using the newly generated frequencies from step (b).

Submit the code and include the restored images for each number of retained low frequencies in your report.
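Steps (a)-(c) can be sketched as below. The random array `img` is only a stand-in for the pixel data of lenaNoise.PNG (which could be loaded, e.g., with matplotlib.image.imread); the block size k is one of the example values from step (b).

```python
import numpy as np

# Stand-in for the lenaNoise.PNG pixel data (hypothetical placeholder).
img = np.random.rand(256, 256)

# (a) 2-D FFT, then shift so low frequencies sit at the center
F = np.fft.fftshift(np.fft.fft2(img))

# (b) keep a k-by-k centered block of low frequencies (k = 40 -> 40^2 kept),
#     zeroing out all higher frequencies
k = 40
mask = np.zeros_like(F)
cy, cx = F.shape[0] // 2, F.shape[1] // 2
mask[cy - k // 2:cy + k // 2, cx - k // 2:cx + k // 2] = 1
F_low = F * mask

# (c) undo the shift and invert the FFT; take the real part of the result
rec = np.real(np.fft.ifft2(np.fft.ifftshift(F_low)))
```

With the full dimension kept (mask of all ones), the reconstruction recovers the input image exactly up to floating-point error, which is a useful check before comparing the low-pass versions.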

  3. (55%) Implement a gradient descent algorithm for the ROF model with total variation minimization. Submit all code and a two-page report including the problem description, a concrete optimization algorithm, and experimental results (a denoised image and a convergence graph generated with your best-tuned parameters) with discussion.

NOTE that

  • Test your program with different Gaussian noise levels (σ = 0.01, 0.05, 0.1) and include all results in your report. A Matlab/Python function 'GenerateNoiseImage' is given for your reference.
  • The forward / backward difference operators for computing the image gradient are given in Dx / Dxt.

Feel free to use it or write your own.

  • Detailed class notes on deriving the total variation, computing the gradient term, and the gradient descent algorithm can be downloaded from Collab.
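A minimal gradient-descent sketch for the ROF objective min_u Σ|∇u| + (λ/2) Σ(u − f)² is given below, under stated assumptions: the course-provided Dx / Dxt operators are replaced by np.roll-based forward/backward differences with periodic boundaries, |∇u| is smoothed with a small eps, and the parameter values λ, τ are illustrative defaults, not the "best-tuned" values the report asks for.

```python
import numpy as np

def rof_gradient_descent(f, lam=10.0, tau=0.01, eps=1e-6, iters=200):
    """Denoise f by gradient descent on the (smoothed) ROF energy.

    Returns the denoised image and the per-iteration energy values,
    which can be plotted as the required convergence graph.
    """
    u = f.copy()
    energies = []
    for _ in range(iters):
        # forward differences: discrete image gradient (periodic boundary)
        ux = np.roll(u, -1, axis=1) - u
        uy = np.roll(u, -1, axis=0) - u
        mag = np.sqrt(ux**2 + uy**2 + eps)   # smoothed |grad u|

        # record the current energy: TV term + data-fidelity term
        energies.append(mag.sum() + 0.5 * lam * np.sum((u - f)**2))

        # backward differences: divergence of the normalized gradient
        px, py = ux / mag, uy / mag
        div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))

        # descend along the ROF gradient: lam*(u - f) - div(grad u / |grad u|)
        u -= tau * (lam * (u - f) - div)
    return u, energies

# Example on a synthetic noisy square (stand-in for GenerateNoiseImage output)
rng = np.random.default_rng(1)
clean = np.zeros((64, 64))
clean[16:48, 16:48] = 1.0
noisy = clean + 0.1 * rng.standard_normal((64, 64))
denoised, energies = rof_gradient_descent(noisy)
```

The recorded `energies` should decrease over the iterations; plotting them against the iteration index yields the convergence graph, and repeating the run for each σ gives the comparison the note above asks for.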