3D gravity inversion by planting anomalous densities


Leonardo Uieda

August 15, 2011

Transcript

  1. 3D gravity inversion by planting anomalous densities Leonardo Uieda and

    Valéria C. F. Barbosa August, 2011 Observatório Nacional
  2.–7. Outline: Forward Problem; Inverse Problem; Planting Algorithm (inspired by
    René, 1986); Synthetic Data; Real Data
  8. Forward problem

  9.–11. Observations of g_z on the surface of the Earth. Group them in a vector:

    g = [g_1, g_2, …, g_N]^T (N×1) = observed data
  12.–14. g = observed data, assumed to be caused by anomalous sources with density
    contrast Δρ.
  15.–19. Parametrize the gravitational effect: linearize and discretize into M elements
    (the interpretative model) made of right rectangular prisms, the jth element having a
    homogeneous density contrast p_j (Nagy et al., 2000). Arrange the M density contrasts
    in a vector:

    p = [p_1, p_2, …, p_M]^T (M×1) = parameter vector
  20.–29. With the interpretative model in place (prisms with p_j = 0 not shown; p_j = Δρ
    inside the sources), the gravitational effect is linear, g ≈ d, with predicted data

    d = ∑_{j=1}^{M} p_j a_j = A p

    where p_j is the density contrast of the jth prism, a_j is the effect of that prism
    with unit density (the jth column of A), p is the parameter vector, and A is the
    Jacobian (sensitivity) matrix.
  30. Solved forward problem: p → d, with d = ∑_{j=1}^{M} p_j a_j
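
As a rough illustration of the linear forward model above (a minimal numpy sketch, not the
authors' code: the Jacobian here is a random stand-in for the prism effects of Nagy et al.,
2000), the predicted data can be written either as a sum of column effects or as a
matrix-vector product:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 5, 12                 # N observations, M prisms in the interpretative model
A = rng.normal(size=(N, M))  # stand-in Jacobian: column a_j = g_z effect of the
                             # jth prism with unit density contrast
p = np.zeros(M)              # parameter vector: density contrast of each prism
p[3], p[7] = 0.5, 1.0        # a few prisms get p_j = Δρ, the rest stay 0

# Predicted data written as the sum of column effects ...
d_sum = sum(p[j] * A[:, j] for j in range(M) if p[j] != 0)
# ... which is the same thing as the matrix form d = A p
d_mat = A @ p
assert np.allclose(d_sum, d_mat)
```
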
  31. How to do the inverse: estimate p̂ from g?

  32. Inverse problem

  33.–39. Minimize the difference between g (observed data) and d (predicted data),
    r = g − d (the residual vector), using the data-misfit function

    φ(p) = ‖r‖_2 = (∑_{i=1}^{N} (g_i − d_i)²)^{1/2}

    the ℓ2-norm of r, i.e. a least-squares fit.
  40.–42. Ill-posed problem: non-existent, non-unique, non-stable. Adding constraints
    turns it into a well-posed problem: the solution exists and is unique and stable.
  43.–51. Constraints: 1. Compact (no holes inside). 2. Concentrated around "seeds":
    user-specified prisms with given density contrasts, allowing any number of different
    density contrasts (similar to René, 1986, except for the multiple density contrasts).
    3. Only p_j = 0 or p_j = ρ_s. 4. p_j = ρ_s of the closest seed.
  52.–55. To go from the ill-posed to the well-posed problem, instead of minimizing only
    the data misfit φ(p) = ‖r‖_2, minimize the goal function

    Γ(p) = φ(p) + μ θ(p)

    where θ(p) is the regularizing function and μ is the regularizing parameter, the
    tradeoff between fit and regularization.
  56.–62. Regularization (similar to Silva Dias et al., 2009):

    θ(p) = ∑_{j=1}^{M} [p_j / (p_j + ϵ)] l_j^β

    where ϵ avoids the singularity at p_j = 0, l_j is the distance between the jth prism
    and its seed, and β controls how much compactness is imposed (3 to 7). For p_j ≠ 0 the
    term grows with the distance from the seeds, so the regularizing function imposes
    compactness and concentration around the seeds.
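
A small sketch of these functions, taking the formulas literally from the slides (the
numeric defaults for ϵ, β, and μ below are illustrative assumptions, not values from the
presentation):

```python
import numpy as np

def data_misfit(g, d):
    """phi(p): l2-norm of the residual r = g - d."""
    return np.linalg.norm(g - d)

def regularizer(p, l, eps=1e-5, beta=5.0):
    """theta(p) = sum_j p_j / (p_j + eps) * l_j**beta.
    l[j] = distance between the jth prism and its closest seed."""
    return np.sum(p / (p + eps) * l**beta)

def goal(g, d, p, l, mu=1.0, eps=1e-5, beta=5.0):
    """Gamma(p) = phi(p) + mu * theta(p): mu trades off fit vs. regularization."""
    return data_misfit(g, d) + mu * regularizer(p, l, eps, beta)
```
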
  63.–64. Constraints 1–2 (compact, concentrated around the seeds) are imposed by the
    regularization; constraints 3–4 (only p_j = 0 or p_j = Δρ_s; p_j = Δρ_s of the closest
    seed) are imposed by the algorithm.
  65. Planting Algorithm

  66.–74. Overview (based on René, 1986): start with the seeds (known density contrast
    and position), with all other parameters set to 0, then iteratively grow the bodies by
    accretion, adding neighbors of the seeds. The growth is controlled by the goal function
    Γ(p) = φ(p) + μ θ(p) and the data-misfit function φ(p) = ‖r‖_2.
  75.–91. Algorithm: define the interpretative model; g = observed data; set all
    parameters to zero; include the N_S seeds (prisms with p_j = 0 not shown); compute the
    initial residuals

    r^(0) = g − d^(0) = g − ∑_{s=1}^{N_S} ρ_s a_{j_s}

    where ρ_s is the density contrast of the sth seed and a_{j_s} is the corresponding
    column vector of A (most parameters are zero, so only the seeds contribute to the
    predicted data d); then find the neighbors of the seeds.
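
A minimal sketch of this initialization, assuming a precomputed sensitivity matrix A (a
stand-in for the Nagy et al., 2000 prism effects) and seeds given as hypothetical
(prism index, density contrast) pairs; the neighbor search is omitted here:

```python
import numpy as np

def initialize(g, A, seeds):
    """Start the planting inversion: zero parameters, plant the seeds,
    and compute the initial residuals r(0) = g - sum_s rho_s * a_{j_s}."""
    M = A.shape[1]
    p = np.zeros(M)               # all parameters start at zero
    for j_s, rho_s in seeds:
        p[j_s] = rho_s            # include the seeds
    r = g - sum(rho_s * A[:, j_s] for j_s, rho_s in seeds)
    return p, r
```
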
  92.–110. Growth (prisms with p_j = 0 not shown): try an accretion to the sth seed by
    choosing the neighbor that (1) reduces the data misfit φ(p) = ‖r‖_2 and (2) gives the
    smallest goal function Γ(p) = φ(p) + μ θ(p). For the chosen neighbor j, set p_j = ρ_s
    (a new element of the body) and update the residuals:

    r^(new) = g − d^(new) = r^(old) − p_j a_j

    since the new predicted data is the old one plus the effect of prism j (for the first
    accretion, r^(old) = r^(0) = g − ∑_{s=1}^{N_S} ρ_s a_{j_s}). If no such neighbor is
    found, there is no accretion for that seed.
  111.–122. After every seed has tried an accretion: did at least one seed grow? If yes,
    repeat the growth step with the updated residuals; if no, the inversion is done.
  123. Advantages: compact & non-smooth solutions; any number of sources; any number of
    different density contrasts; no large equation system; search limited to neighbors.
  124.–129. Remember the equations: the initial residual r^(0) = g − ∑_{s=1}^{N_S} ρ_s a_{j_s}
    and the residual update r^(new) = r^(old) − p_j a_j involve no matrix multiplication,
    only vector operations. Only some columns of A are ever needed, so they can be
    calculated only when needed and deleted after the update (lazy evaluation).
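
The update touches a single column of A, so that column can be built on demand and thrown
away again, a minimal sketch with a hypothetical `compute_column(j)` standing in for the
unit-density effect of prism j:

```python
def accrete(r, p, j, rho_s, compute_column):
    """Add prism j with density contrast rho_s and update the residuals with
    one vector operation: r(new) = r(old) - p_j * a_j."""
    a_j = compute_column(j)     # lazy evaluation: build the column only when needed
    p[j] = rho_s
    r_new = r - rho_s * a_j     # no matrix multiplication, only vector arithmetic
    del a_j                     # the column is not kept after the update
    return r_new
```
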
  130.–132. Advantages: compact & non-smooth solutions; any number of sources; any number
    of different density contrasts; no large equation system; search limited to neighbors;
    no matrix multiplication (only vector operations); lazy evaluation of the Jacobian.
    The result is a fast inversion with low memory usage.
  133. Synthetic Data

  135.–142. Synthetic model: two sources of 1 km × 1 km × 1 km, with density contrasts
    Δρ = 0.5 g/cm³ and Δρ = 1.0 g/cm³ and depths of 0.8 km and 1.6 km. Data set: 375
    observations over a 5 km × 3 km area, contaminated with 0.05 mGal Gaussian noise.
  143. Interpretative model: 151,875 prisms of 66.7 m × 66.7 m × 66.7 m.
  144.–148. Used 2 seeds, placed in the centers of the sources, with the corresponding
    density contrasts (Δρ = 0.5 g/cm³ and Δρ = 1.0 g/cm³).
  149.–155. Inversion result: the estimated bodies are compact, concentrated around the
    seeds, and recover the correct geometry of the sources; the predicted data fits the
    observed data. On a 2.0 GHz laptop, with 375 data and 151,875 prisms, the total time
    was ≈ 4.4 min.
  156. Real Data

  157.–163. Cana Brava complex (CBC) & Palmeirópolis sequence (PVSS), after Carminatti
    et al. (2003): outcropping bodies in the north of Goiás, Tocantins Province, between
    the Amazonian & São Francisco cratons. Gravimetric data: 132 observations, residual
    Bouguer anomaly, maximum of 45 mGal. Previous interpretation (Carminatti et al., 2003):
    Δρ = 0.27 g/cm³ for the PVSS, Δρ = 0.39 g/cm³ for the CBC, and maximum depth ≈ 6 km.
    The inversion tests this hypothesis.
  164. Assign seeds (269 in total): green: z = 0 km, Δρ = 0.27 g/cm³; blue: z = 2 km,
    Δρ = 0.27 g/cm³; red: z = 0 km, Δρ = 0.39 g/cm³.
  165. Interpretative model: 120 km × 50 km × 11 km, 480,000 prisms of
    500 m × 500 m × 575 m.
  166.–176. Inversion result: the estimated bodies are compact, reach a maximum depth of
    ≈ 6 km, agree with the previous interpretation, and fit the observations (predicted
    vs. observed data). On a 2.0 GHz laptop, with 132 data and 480,000 prisms, the total
    time was ≈ 3.75 min.
  177. Conclusions

  178. Conclusions: • New 3D gravity inversion • Multiple sources • Interfering
    gravitational effects • Abrupt density-contrast distribution • No matrix
    multiplication • No need to solve large linear systems • Ideal for: ore bodies,
    intrusions, salt domes, etc.
  179. Previous and future work: • Developed for gravity gradients • Preliminary results
    presented at EAGE 2011 • To be presented at SEG 2011: final results and a robust
    method to handle non-targeted sources
  180. Thank you