<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2 Final//EN">
<!--Converted with LaTeX2HTML 98.1p1 release (March 2nd, 1998)
originally by Nikos Drakos (nikos@cbl.leeds.ac.uk), CBLU, University of Leeds
* revised and updated by:  Marcus Hennecke, Ross Moore, Herb Swan
* with significant contributions from:
  Jens Lippmann, Marek Rouchal, Martin Wilck and others -->
<HTML>
<HEAD>
<TITLE>The Wiener-like filtering in the wavelet space</TITLE>
<META NAME="description" CONTENT="The Wiener-like filtering in the wavelet space">
<META NAME="keywords" CONTENT="vol2">
<META NAME="resource-type" CONTENT="document">
<META NAME="distribution" CONTENT="global">
<META HTTP-EQUIV="Content-Type" CONTENT="text/html; charset=iso-8859-1">
<LINK REL="STYLESHEET" HREF="vol2.css">
<LINK REL="next" HREF="node331.html">
<LINK REL="previous" HREF="node329.html">
<LINK REL="up" HREF="node328.html">
<LINK REL="next" HREF="node331.html">
</HEAD>
<BODY >
<!--Navigation Panel-->
<A NAME="tex2html5567"
 HREF="node331.html">
<IMG WIDTH="37" HEIGHT="24" ALIGN="BOTTOM" BORDER="0" ALT="next"
 SRC="icons.gif/next_motif.gif"></A> 
<A NAME="tex2html5564"
 HREF="node328.html">
<IMG WIDTH="26" HEIGHT="24" ALIGN="BOTTOM" BORDER="0" ALT="up"
 SRC="icons.gif/up_motif.gif"></A> 
<A NAME="tex2html5558"
 HREF="node329.html">
<IMG WIDTH="63" HEIGHT="24" ALIGN="BOTTOM" BORDER="0" ALT="previous"
 SRC="icons.gif/previous_motif.gif"></A> 
<A NAME="tex2html5566"
 HREF="node1.html">
<IMG WIDTH="65" HEIGHT="24" ALIGN="BOTTOM" BORDER="0" ALT="contents"
 SRC="icons.gif/contents_motif.gif"></A>  
<BR>
<B> Next:</B> <A NAME="tex2html5568"
 HREF="node331.html">Hierarchical Wiener filtering</A>
<B> Up:</B> <A NAME="tex2html5565"
 HREF="node328.html">Noise reduction from the</A>
<B> Previous:</B> <A NAME="tex2html5559"
 HREF="node329.html">The convolution from the</A>
<BR>
<BR>
<!--End of Navigation Panel-->

<H2><A NAME="SECTION002062000000000000000">&#160;</A>
<A NAME="sec_filt_4">&#160;</A>
<BR>
The Wiener-like filtering in the wavelet space
</H2>
Let us consider a measured wavelet coefficient <I>w</I><SUB><I>i</I></SUB> at scale <I>i</I>. 
We assume
that its value, at a given scale and position,
results from a noisy process following a Gaussian distribution with
mathematical expectation <I>W</I><SUB><I>i</I></SUB> and standard deviation <I>B</I><SUB><I>i</I></SUB>:
<BR>
<DIV ALIGN="CENTER">

<!-- MATH: \begin{eqnarray}
P(w_i/W_i)  =  \frac{1}{\sqrt{2\pi}B_i}  e^{- \frac{(w_i-W_i)^2} {2B_i^2}}
\end{eqnarray} -->

<TABLE ALIGN="CENTER" CELLPADDING="0" WIDTH="100%">
<TR VALIGN="MIDDLE"><TD NOWRAP ALIGN="RIGHT"><IMG
 WIDTH="298" HEIGHT="91" ALIGN="MIDDLE" BORDER="0"
 SRC="img786.gif"
 ALT="$\displaystyle P(w_i/W_i) = \frac{1}{\sqrt{2\pi}B_i} e^{- \frac{(w_i-W_i)^2} {2B_i^2}}$"></TD>
<TD>&nbsp;</TD>
<TD>&nbsp;</TD>
<TD WIDTH=10 ALIGN="RIGHT">
(14.76)</TD></TR>
</TABLE></DIV>
<BR CLEAR="ALL"><P></P>
Now we assume that the set of expected coefficients <I>W</I><SUB><I>i</I></SUB> for a given
scale also follows a Gaussian distribution, with zero mean and
standard deviation <I>S</I><SUB><I>i</I></SUB>:
<BR>
<DIV ALIGN="CENTER">

<!-- MATH: \begin{eqnarray}
P(W_i) = \frac{1}{\sqrt{2\pi}S_i}e^{-\frac{W_i^2}{2S_i^2}}
\end{eqnarray} -->

<TABLE ALIGN="CENTER" CELLPADDING="0" WIDTH="100%">
<TR VALIGN="MIDDLE"><TD NOWRAP ALIGN="RIGHT"><IMG
 WIDTH="217" HEIGHT="96" ALIGN="MIDDLE" BORDER="0"
 SRC="img787.gif"
 ALT="$\displaystyle P(W_i) = \frac{1}{\sqrt{2\pi}S_i}e^{-\frac{W_i^2}{2S_i^2}}$"></TD>
<TD>&nbsp;</TD>
<TD>&nbsp;</TD>
<TD WIDTH=10 ALIGN="RIGHT">
(14.77)</TD></TR>
</TABLE></DIV>
<BR CLEAR="ALL"><P></P>
The zero mean follows from the defining property of the wavelet function:
<BR>
<DIV ALIGN="CENTER">

<!-- MATH: \begin{eqnarray}
\int_{-\infty}^{+\infty} \psi^*(x) dx = 0
\end{eqnarray} -->

<TABLE ALIGN="CENTER" CELLPADDING="0" WIDTH="100%">
<TR VALIGN="MIDDLE"><TD NOWRAP ALIGN="RIGHT"><IMG
 WIDTH="179" HEIGHT="73" ALIGN="MIDDLE" BORDER="0"
 SRC="img788.gif"
 ALT="$\displaystyle \int_{-\infty}^{+\infty} \psi^*(x) dx = 0$"></TD>
<TD>&nbsp;</TD>
<TD>&nbsp;</TD>
<TD WIDTH=10 ALIGN="RIGHT">
(14.78)</TD></TR>
</TABLE></DIV>
<BR CLEAR="ALL"><P></P>
We want to get an estimate of <I>W</I><SUB><I>i</I></SUB> knowing <I>w</I><SUB><I>i</I></SUB>. Bayes' theorem gives:
<BR>
<DIV ALIGN="CENTER"><A NAME="equa_baye">&#160;</A>
<!-- MATH: \begin{eqnarray}
P(W_i/w_i) = \frac{P(W_i)P(w_i/W_i)}{P(w_i)}
\end{eqnarray} -->

<TABLE ALIGN="CENTER" CELLPADDING="0" WIDTH="100%">
<TR VALIGN="MIDDLE"><TD NOWRAP ALIGN="RIGHT"><IMG
 WIDTH="290" HEIGHT="74" ALIGN="MIDDLE" BORDER="0"
 SRC="img789.gif"
 ALT="$\displaystyle P(W_i/w_i) = \frac{P(W_i)P(w_i/W_i)}{P(w_i)}$"></TD>
<TD>&nbsp;</TD>
<TD>&nbsp;</TD>
<TD WIDTH=10 ALIGN="RIGHT">
(14.79)</TD></TR>
</TABLE></DIV>
<BR CLEAR="ALL"><P></P>
We get:
<BR>
<DIV ALIGN="CENTER">

<!-- MATH: \begin{eqnarray}
P(W_i/w_i) = \frac{1}{\sqrt{2\pi}\beta_i}e^{-\frac{(W_i-\alpha_i
w_i)^2}{2\beta_i^2}}
\end{eqnarray} -->

<TABLE ALIGN="CENTER" CELLPADDING="0" WIDTH="100%">
<TR VALIGN="MIDDLE"><TD NOWRAP ALIGN="RIGHT"><IMG
 WIDTH="310" HEIGHT="91" ALIGN="MIDDLE" BORDER="0"
 SRC="img790.gif"
 ALT="$\displaystyle P(W_i/w_i) = \frac{1}{\sqrt{2\pi}\beta_i}e^{-\frac{(W_i-\alpha_i
w_i)^2}{2\beta_i^2}}$"></TD>
<TD>&nbsp;</TD>
<TD>&nbsp;</TD>
<TD WIDTH=10 ALIGN="RIGHT">
(14.80)</TD></TR>
</TABLE></DIV>
<BR CLEAR="ALL"><P></P>
where:
<BR>
<DIV ALIGN="CENTER">

<!-- MATH: \begin{eqnarray}
\alpha_i = \frac{S_i^2}{S_i^2+B_i^2}
\end{eqnarray} -->

<TABLE ALIGN="CENTER" CELLPADDING="0" WIDTH="100%">
<TR VALIGN="MIDDLE"><TD NOWRAP ALIGN="RIGHT"><IMG
 WIDTH="137" HEIGHT="77" ALIGN="MIDDLE" BORDER="0"
 SRC="img791.gif"
 ALT="$\displaystyle \alpha_i = \frac{S_i^2}{S_i^2+B_i^2}$"></TD>
<TD>&nbsp;</TD>
<TD>&nbsp;</TD>
<TD WIDTH=10 ALIGN="RIGHT">
(14.81)</TD></TR>
</TABLE></DIV>
<BR CLEAR="ALL"><P></P>
Thus the probability  
<!-- MATH: $P(W_i/w_i)$ -->
<I>P</I>(<I>W</I><SUB><I>i</I></SUB>/<I>w</I><SUB><I>i</I></SUB>) follows a Gaussian distribution with mean:
<BR>
<DIV ALIGN="CENTER">

<!-- MATH: \begin{eqnarray}
m = \alpha_i w_i
\end{eqnarray} -->

<TABLE ALIGN="CENTER" CELLPADDING="0" WIDTH="100%">
<TR VALIGN="MIDDLE"><TD NOWRAP ALIGN="RIGHT"><IMG
 WIDTH="98" HEIGHT="39" ALIGN="MIDDLE" BORDER="0"
 SRC="img792.gif"
 ALT="$\displaystyle m = \alpha_i w_i$"></TD>
<TD>&nbsp;</TD>
<TD>&nbsp;</TD>
<TD WIDTH=10 ALIGN="RIGHT">
(14.82)</TD></TR>
</TABLE></DIV>
<BR CLEAR="ALL"><P></P>
and a variance:
<BR>
<DIV ALIGN="CENTER">

<!-- MATH: \begin{eqnarray}
\beta_i^2 = \frac{S_i^2B_i^2}{S_i^2 + B_i^2}
\end{eqnarray} -->

<TABLE ALIGN="CENTER" CELLPADDING="0" WIDTH="100%">
<TR VALIGN="MIDDLE"><TD NOWRAP ALIGN="RIGHT"><IMG
 WIDTH="140" HEIGHT="77" ALIGN="MIDDLE" BORDER="0"
 SRC="img793.gif"
 ALT="$\displaystyle \beta_i^2 = \frac{S_i^2B_i^2}{S_i^2 + B_i^2}$"></TD>
<TD>&nbsp;</TD>
<TD>&nbsp;</TD>
<TD WIDTH=10 ALIGN="RIGHT">
(14.83)</TD></TR>
</TABLE></DIV>
<BR CLEAR="ALL"><P></P>
The mathematical expectation of <I>W</I><SUB><I>i</I></SUB> is 
<!-- MATH: $\alpha_i w_i$ -->
<IMG
 WIDTH="49" HEIGHT="39" ALIGN="MIDDLE" BORDER="0"
 SRC="img794.gif"
 ALT="$\alpha_i w_i$">.

<P>
Multiplying the coefficients by the constant <IMG
 WIDTH="26" HEIGHT="39" ALIGN="MIDDLE" BORDER="0"
 SRC="img795.gif"
 ALT="$\alpha_i$">
therefore gives a linear filter: coefficients at scales where the signal
dominates (<I>S</I><SUB><I>i</I></SUB> large compared to <I>B</I><SUB><I>i</I></SUB>) are kept
almost unchanged, while coefficients at noise-dominated scales are strongly
attenuated. The algorithm is (a numerical sketch is given after the list):
<DL COMPACT>
<DT>1.
<DD>Compute the wavelet transform of the data. We get <I>w</I><SUB><I>i</I></SUB>.
<DT>2.
<DD>Estimate the standard deviation of the noise <I>B</I><SUB>0</SUB> of the first plane
from the histogram of <I>w</I><SUB>0</SUB>. Since we process oversampled images, the
values of the wavelet image at the first scale (<I>w</I><SUB>0</SUB>)
are due mainly to the noise, and the histogram shows a Gaussian peak
centered on 0. We compute the standard deviation of this Gaussian with
<IMG
 WIDTH="30" HEIGHT="21" ALIGN="BOTTOM" BORDER="0"
 SRC="img796.gif"
 ALT="$3\sigma$">
clipping, rejecting pixels where the signal
could be significant;
<DT>3.
<DD>Set i to 0. 
<DT>4.
<DD>Estimate the standard deviation of the noise <I>B</I><SUB><I>i</I></SUB> from <I>B</I><SUB>0</SUB>. This
is done by studying how the noise varies from one
scale to the next, under the hypothesis of white Gaussian noise;
<DT>5.
<DD>
<!-- MATH: $S_i^2 = s_i^2 - B_i^2$ -->
<I>S</I><SUB><I>i</I></SUB><SUP>2</SUP> = <I>s</I><SUB><I>i</I></SUB><SUP>2</SUP> - <I>B</I><SUB><I>i</I></SUB><SUP>2</SUP> where <I>s</I><SUB><I>i</I></SUB><SUP>2</SUP> is the variance of <I>w</I><SUB><I>i</I></SUB>.
<DT>6.
<DD>
<!-- MATH: $\alpha_i = \frac{S_i^2}{S_i^2+B_i^2}$ -->
<IMG
 WIDTH="114" HEIGHT="62" ALIGN="MIDDLE" BORDER="0"
 SRC="img797.gif"
 ALT="$\alpha_i = \frac{S_i^2}{S_i^2+B_i^2}$">.
<DT>7.
<DD>
<!-- MATH: $W_i = \alpha_i w_i$ -->
<IMG
 WIDTH="106" HEIGHT="41" ALIGN="MIDDLE" BORDER="0"
 SRC="img798.gif"
 ALT="$W_i = \alpha_i w_i$">.
<DT>8.
<DD><I>i</I> = <I>i</I> + 1 and go to 4, until the last wavelet scale has been processed.
<DT>9.
<DD>Reconstruct the picture from <I>W</I><SUB><I>i</I></SUB>.
</DL>
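<P>
The following Python sketch illustrates the procedure above. The "a trous"
transform with a B3-spline kernel, the helper names and the NumPy
implementation are illustrative assumptions made for this sketch, not the
MIDAS code itself; any redundant wavelet transform with known per-scale
noise behaviour could be substituted.
<PRE>
# Minimal sketch of the scale-by-scale Wiener-like filter described above.
import numpy as np

B3 = np.array([1, 4, 6, 4, 1], dtype=float) / 16.0   # B3-spline smoothing kernel

def atrous_transform(image, n_scales):
    """Return the wavelet planes [w_0, ..., w_{n-1}] plus the last smoothed plane."""
    c = image.astype(float)
    planes = []
    for j in range(n_scales):
        step = 2 ** j
        smooth = c.copy()
        for axis in (0, 1):
            acc = np.zeros_like(smooth)
            # separable smoothing "with holes": kernel taps spaced by 2^j
            for k, coef in zip((-2, -1, 0, 1, 2), B3):
                acc += coef * np.roll(smooth, k * step, axis=axis)
            smooth = acc
        planes.append(c - smooth)          # wavelet coefficients w_j
        c = smooth
    planes.append(c)                       # residual smoothed plane, kept unchanged
    return planes

def noise_sigma_clipped(plane, n_iter=3):
    """Standard deviation of the Gaussian noise peak, with 3-sigma clipping."""
    data = plane.ravel()
    for _ in range(n_iter):
        sigma = data.std()
        data = data[np.abs(data - data.mean()) < 3.0 * sigma]
    return data.std()

def noise_scale_factors(shape, n_scales, seed=0):
    """B_i / B_0, obtained by transforming unit-variance white Gaussian noise."""
    rng = np.random.default_rng(seed)
    planes = atrous_transform(rng.normal(size=shape), n_scales)
    stds = np.array([p.std() for p in planes[:-1]])
    return stds / stds[0]

def wiener_like_filter(image, n_scales=4):
    planes = atrous_transform(image, n_scales)               # step 1
    b0 = noise_sigma_clipped(planes[0])                      # step 2
    factors = noise_scale_factors(image.shape, n_scales)     # prepared for step 4
    filtered = []
    for i in range(n_scales):                                # steps 3 to 8
        bi = b0 * factors[i]                                  # step 4: B_i from B_0
        si2 = max(planes[i].var() - bi ** 2, 0.0)             # step 5 (floored at 0)
        alpha = si2 / (si2 + bi ** 2) if si2 > 0.0 else 0.0   # step 6
        filtered.append(alpha * planes[i])                    # step 7: W_i = alpha_i w_i
    filtered.append(planes[-1])                               # keep the smoothed plane
    return np.sum(filtered, axis=0)                           # step 9: reconstruction

# Usage (hypothetical image array):
#   clean = wiener_like_filter(noisy_image, n_scales=5)
</PRE>
The per-scale noise factors are computed once from a simulated white-noise
image, which is one common way of exploiting the white Gaussian noise
hypothesis of step 4.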
<P>
<HR>
<!--Navigation Panel-->
<A NAME="tex2html5567"
 HREF="node331.html">
<IMG WIDTH="37" HEIGHT="24" ALIGN="BOTTOM" BORDER="0" ALT="next"
 SRC="icons.gif/next_motif.gif"></A> 
<A NAME="tex2html5564"
 HREF="node328.html">
<IMG WIDTH="26" HEIGHT="24" ALIGN="BOTTOM" BORDER="0" ALT="up"
 SRC="icons.gif/up_motif.gif"></A> 
<A NAME="tex2html5558"
 HREF="node329.html">
<IMG WIDTH="63" HEIGHT="24" ALIGN="BOTTOM" BORDER="0" ALT="previous"
 SRC="icons.gif/previous_motif.gif"></A> 
<A NAME="tex2html5566"
 HREF="node1.html">
<IMG WIDTH="65" HEIGHT="24" ALIGN="BOTTOM" BORDER="0" ALT="contents"
 SRC="icons.gif/contents_motif.gif"></A>  
<BR>
<B> Next:</B> <A NAME="tex2html5568"
 HREF="node331.html">Hierarchical Wiener filtering</A>
<B> Up:</B> <A NAME="tex2html5565"
 HREF="node328.html">Noise reduction from the</A>
<B> Previous:</B> <A NAME="tex2html5559"
 HREF="node329.html">The convolution from the</A>
<!--End of Navigation Panel-->
<ADDRESS>
<I>Petra Nass</I>
<BR><I>1999-06-15</I>
</ADDRESS>
</BODY>
</HTML>