982 results for Semi-implicit methods


Relevance:

30.00%

Publisher:

Abstract:

In this thesis, two major topics in medical ultrasound imaging are addressed: deconvolution and segmentation. For the first, a deconvolution algorithm is described that restores statistically consistent maximum a posteriori estimates of the tissue reflectivity. These estimates are proven to provide a reliable source of information for an accurate characterization of biological tissues through the ultrasound echo. The second topic involves the definition of a semi-automatic algorithm for myocardium segmentation in 2D echocardiographic images. The results show that the proposed method can reduce inter- and intra-observer variability in myocardial contour delineation and is feasible and accurate even on clinical data.
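Under Gaussian noise and prior assumptions, a maximum a posteriori deconvolution reduces to Tikhonov-regularized least squares. The sketch below illustrates that general idea on a toy 1D signal; the pulse `h`, the reflectivity sequence and all parameters are invented for illustration, and this is not the thesis algorithm:

```python
# Toy MAP deconvolution: y = h * x + noise, with a Gaussian prior on x.
# The MAP estimate then minimizes ||y - h*x||^2 + lam*||x||^2.

def conv(h, x, n):
    """Convolution of x with pulse h, truncated to length n."""
    y = [0.0] * n
    for i in range(len(x)):
        for j in range(len(h)):
            if i + j < n:
                y[i + j] += h[j] * x[i]
    return y

def map_deconvolve(y, h, lam=0.01, steps=2000, lr=0.05):
    """Gradient descent on the Tikhonov-regularized objective."""
    n = len(y)
    x = [0.0] * n
    for _ in range(steps):
        r = [yi - ci for yi, ci in zip(y, conv(h, x, n))]  # residual
        for i in range(n):
            # gradient: correlation of the pulse with the residual, plus prior
            g = -2.0 * sum(h[j] * r[i + j] for j in range(len(h)) if i + j < n)
            g += 2.0 * lam * x[i]
            x[i] -= lr * g
    return x

if __name__ == "__main__":
    h = [1.0, 0.5]                      # hypothetical blurring pulse
    x_true = [0, 0, 1, 0, 0, 2, 0, 0]   # sparse "reflectivity" sequence
    y = conv(h, x_true, len(x_true))    # noise-free observation
    x_hat = map_deconvolve(y, h)
    print([round(v, 2) for v in x_hat])
```

With a small regularization weight the recovered sequence is close to the true reflectivity, shrunk slightly toward zero by the prior.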

Abstract:

Consumer demand for natural, minimally processed, fresh-like and functional food has led to an increasing interest in emerging technologies. The aim of this PhD project was to study three innovative food processing technologies currently used in the food sector. Ultrasound-assisted freezing, vacuum impregnation and pulsed electric fields were investigated using laboratory-scale systems and semi-industrial pilot plants. Furthermore, analytical and sensory techniques were developed to evaluate the quality of food and vegetable matrices obtained by traditional and emerging processes. Ultrasound was found to be a valuable technique for improving the freezing of potatoes, anticipating the onset of nucleation, particularly when applied during the supercooling phase. The effects of pulsed electric fields on the phenol and enzymatic profile of melon juice were studied, with the statistical treatment of the data carried out by a response surface method. Next, flavour enrichment of apple sticks was performed using different techniques, namely atmospheric, vacuum and ultrasound technologies and their combinations. The second section of the thesis deals with the development of analytical methods for the discrimination and quantification of phenol compounds in vegetable matrices, such as chestnut bark extracts and olive mill waste water. Waste disposal in the olive mill sector was approached with the aim of reducing the amount of waste while recovering valuable by-products for use in different industrial sectors. Finally, the sensory analysis of boiled potatoes was carried out through the development of a quantitative descriptive procedure for the study of Italian and Mexican potato varieties.
An update on flavour development in fresh and cooked potatoes was produced, and a sensory glossary, including general and specific definitions related to organic products and used in the European project Ecropolis, was drafted.

Abstract:

The biogenic production of NO in the soil accounts for between 10% and 40% of the global total. A large part of the uncertainty in the estimation of biogenic emissions stems from a shortage of measurements in arid regions, which comprise 40% of the earth's land surface area. This study examined the emission of NO from three ecosystems in southern Africa covering an aridity gradient from semi-arid savannas in South Africa to the hyper-arid Namib Desert in Namibia. A laboratory method was used to determine the release of NO as a function of soil moisture and soil temperature. Various methods were used to up-scale the net potential NO emissions determined in the laboratory to the vegetation patch, landscape or regional level. The importance of landscape, vegetation and climatic characteristics is emphasized. The first study took place in a semi-arid savanna region in South Africa, where soils were sampled from four landscape positions in the Kruger National Park. The maximum NO emission occurred at soil moisture contents of 10%-20% water-filled pore space (WFPS). The highest net potential NO emissions came from the low-lying landscape positions, which have the largest nitrogen (N) stocks and the largest N input. Net potential NO fluxes obtained in the laboratory were converted into field fluxes for the period 2003-2005 for the four landscape positions, using soil moisture and temperature data obtained in situ at the Kruger National Park flux tower site. The NO emissions ranged from 1.5 to 8.5 kg ha⁻¹ a⁻¹. The field fluxes were up-scaled to a regional basis using geographic information system (GIS) based techniques; this indicated that the highest NO emissions occurred from the midslope positions due to their large geographical extent in the research area. Total emissions ranged from 20×10³ kg in 2004 to 34×10³ kg in 2003 for the 56,000 ha Skukuza land type. The second study took place in an arid savanna ecosystem in the Kalahari, Botswana.
In this study I collected soils from four differing vegetation patch types: Pan, Annual Grassland, Perennial Grassland and Bush Encroached patches. The maximum net potential NO fluxes ranged from 0.27 ng m⁻² s⁻¹ in the Pan patches to 2.95 ng m⁻² s⁻¹ in the Perennial Grassland patches. The net potential NO emissions were up-scaled for the year December 2005-November 2006, using 1) the net potential NO emissions determined in the laboratory, 2) the vegetation patch distribution obtained from LANDSAT NDVI measurements, 3) soil moisture contents estimated from ENVISAT ASAR measurements, and 4) soil surface temperatures from MODIS 8-day land surface temperature measurements. This up-scaling procedure gave NO fluxes ranging from 1.8 g ha⁻¹ month⁻¹ in the winter months (June and July) to 323 g ha⁻¹ month⁻¹ in the summer months (January-March). Differences occurred between the vegetation patches: the highest NO fluxes occurred in the Perennial Grassland patches and the lowest in the Pan patches. Over the course of the year the mean up-scaled NO emission for the studied region was 0.54 kg ha⁻¹ a⁻¹, which accounts for a loss of approximately 7.4% of the estimated N input to the region. The third study took place in the hyper-arid Namib Desert in Namibia. Soils were sampled from three ecosystems: Dunes, Gravel Plains and the riparian zone of the Kuiseb River. The net potential NO flux measured in the laboratory was used to estimate the NO flux of the Namib Desert for 2006 using modelled soil moisture and temperature data from the European Centre for Medium-Range Weather Forecasts (ECMWF) operational model at a 36 km × 35 km spatial resolution. The maximum net potential NO production occurred at low soil moisture contents (<10% WFPS), and the optimal temperature was 25°C in the Dune and Riparian ecosystems and 35°C in the Gravel Plain ecosystems.
The maximum net potential NO fluxes ranged from 3.0 ng m⁻² s⁻¹ in the Riparian ecosystem to 6.2 ng m⁻² s⁻¹ in the Gravel Plains ecosystem. Up-scaling the net potential NO flux gave NO fluxes of up to 0.062 kg ha⁻¹ a⁻¹ in the Dune ecosystem and 0.544 kg ha⁻¹ a⁻¹ in the Gravel Plain ecosystem. These studies show that NO is emitted ubiquitously from terrestrial ecosystems; as such, the NO emission potential of deserts and scrublands should be taken into account in global NO models. The emission of NO is influenced by various factors such as landscape, vegetation and climate. This study looks at the potential emissions from certain arid and semi-arid environments in southern Africa and other parts of the world and discusses some of the important factors controlling the emission of NO from the soil.
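The up-scaling chain described above (laboratory response curves driven by soil moisture and temperature series, integrated to an annual flux per hectare) can be sketched as follows. The bell-shaped moisture response and Q10 temperature response are generic stand-ins, not the study's fitted curves, and all parameter values are illustrative:

```python
# Illustrative up-scaling of a laboratory-derived net potential NO flux
# to an annual field estimate.

import math

def no_flux(wfps, temp_c, f_max=3.0, wfps_opt=15.0, q10=2.0, t_ref=25.0):
    """Net potential NO flux in ng m^-2 s^-1 for a given soil state."""
    moisture = (wfps / wfps_opt) * math.exp(1.0 - wfps / wfps_opt)  # peaks at wfps_opt
    temperature = q10 ** ((temp_c - t_ref) / 10.0)                  # Q10 scaling
    return f_max * moisture * temperature

def annual_flux_kg_per_ha(hourly_states):
    """Integrate hourly (WFPS %, temperature degC) states over a year."""
    ng_m2 = sum(no_flux(w, t) * 3600.0 for w, t in hourly_states)
    # ng m^-2 -> kg ha^-1 : 1e4 m^2 per ha, 1e-12 kg per ng
    return ng_m2 * 1e4 * 1e-12

if __name__ == "__main__":
    # hypothetical year: 8760 hours at 15% WFPS and 25 degC
    states = [(15.0, 25.0)] * 8760
    print(round(annual_flux_kg_per_ha(states), 2))  # -> 0.95
```

With these invented inputs the annual estimate lands near 1 kg ha⁻¹ a⁻¹, the same order of magnitude as the up-scaled fluxes reported above.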

Abstract:

The use of guided ultrasonic waves (GUW) has increased considerably in the fields of non-destructive evaluation (NDE) and structural health monitoring (SHM) due to their ability to perform long-range inspections, to probe hidden areas, and to provide complete monitoring of the entire waveguide. Guided waves can be fully exploited only once their dispersive properties are known for the given waveguide. In this context, well-established analytical and numerical methods are represented by the Matrix family methods and the Semi-Analytical Finite Element (SAFE) methods. However, while the former are limited to simple geometries of finite or infinite extent, the latter can model arbitrary cross-section waveguides of finite domain only. This thesis is aimed at developing three different numerical methods for modelling wave propagation in complex translationally invariant systems. First, a classical SAFE formulation for viscoelastic waveguides is extended to account for a three-dimensional translationally invariant static prestress state. The effect of prestress, residual stress and applied loads on the dispersion properties of the guided waves is shown. Next, a two-and-a-half dimensional Boundary Element Method (2.5D BEM) for the dispersion analysis of damped guided waves in waveguides and cavities of arbitrary cross-section is proposed. The attenuation dispersion spectrum due to material damping and geometrical spreading of cavities with arbitrary shape is shown for the first time. Finally, a coupled SAFE-2.5D BEM framework is developed to study the dispersion characteristics of waves in viscoelastic waveguides of arbitrary geometry embedded in infinite solid or liquid media. Dispersion of leaky and non-leaky guided waves in terms of speed and attenuation, as well as the radiated wavefields, can be computed. The results obtained in this thesis can be helpful for the design of both actuation and sensing systems in practical applications, as well as for tuning experimental setups.
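The kind of quantity these dispersion solvers deliver can be illustrated with the simplest closed-form analogue: the longitudinal mode of a thin viscoelastic rod, where a complex modulus E(1 + iη) yields a complex wavenumber whose real part gives the phase speed and whose imaginary part gives the attenuation. The material values below are illustrative (roughly aluminium-like), not from the thesis:

```python
# Minimal analogue of a damped dispersion computation: for a thin
# viscoelastic rod, k = omega / c* with c* = sqrt(E*(1 + i*eta)/rho).
# Phase speed = omega / Re(k); attenuation = |Im(k)| in Np/m.

import cmath
import math

def rod_dispersion(freq_hz, E=70e9, rho=2700.0, eta=0.01):
    omega = 2.0 * math.pi * freq_hz
    c_star = cmath.sqrt(E * (1.0 + 1j * eta) / rho)  # complex bar velocity
    k = omega / c_star                               # complex wavenumber
    phase_speed = omega / k.real                     # m/s
    attenuation = abs(k.imag)                        # Np/m
    return phase_speed, attenuation

if __name__ == "__main__":
    for f in (1e4, 1e5, 1e6):
        c, a = rod_dispersion(f)
        print(f"{f:.0e} Hz: c = {c:.0f} m/s, alpha = {a:.4f} Np/m")
```

For this non-dispersive toy mode the attenuation grows linearly with frequency while the phase speed stays near the bar velocity; the SAFE and 2.5D BEM formulations of the thesis compute the same two quantities, mode by mode, for arbitrary cross-sections.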

Abstract:

The shallow water equations (SWE) are a hyperbolic system of balance laws that provide adequate approximations to large-scale flows in oceans, rivers, and the atmosphere, while conserving mass and momentum. We distinguish two characteristic speeds: the advection speed, i.e. the speed of mass transport, and the gravity-wave speed, i.e. the speed of the surface waves that carry energy and momentum. The Froude number is a dimensionless quantity given by the ratio of the reference advection speed to the reference gravity-wave speed. For the applications mentioned above it is typically very small, e.g. 0.01. Time-explicit finite-volume schemes are the methods most commonly used for the numerical solution of hyperbolic balance laws. They must satisfy the CFL stability condition, so the time step is roughly proportional to the Froude number. For small Froude numbers, say below 0.2, this leads to a high computational cost, and the numerical solutions are dissipative. It is well known that, under suitable conditions, the solutions of the SWE converge to the solutions of the lake equations (the zero-Froude-number SWE) as the Froude number tends to zero. In this limit the equations change type from hyperbolic to hyperbolic-elliptic. Furthermore, at small Froude numbers the order of convergence may drop or the numerical scheme may break down. In particular, wrong asymptotic behaviour (with respect to the Froude number) has been observed in time-explicit schemes, which could cause these effects. Oceanographic and atmospheric flows are typically small perturbations of an underlying equilibrium state. We want numerical schemes for balance laws to preserve certain equilibrium states exactly, since otherwise the scheme can generate spurious flows; the approximation of the source term is therefore essential. Numerical schemes that preserve equilibrium states are called well-balanced.

In this thesis we split the SWE into a stiff, linear part and a non-stiff part in order to circumvent the severe time-step restriction imposed by the CFL condition. The stiff part is approximated implicitly and the non-stiff part explicitly. To this end we use IMEX (implicit-explicit) Runge-Kutta and IMEX multistep time discretizations. The spatial discretization is carried out with the finite-volume method. The stiff part is approximated by finite differences or in a genuinely multi-dimensional manner; for the multi-dimensional approximation we use approximate evolution operators that take all of the infinitely many directions of information propagation into account. The explicit terms are approximated with standard numerical fluxes. We thereby obtain a stability condition analogous to that of a purely advective flow, i.e. the time step grows by a factor equal to the reciprocal of the Froude number. The schemes derived in this thesis are asymptotic preserving and well-balanced. The asymptotic-preserving property ensures that the numerical solution exhibits the "correct" asymptotic behaviour with respect to small Froude numbers. We present first- and second-order schemes. Numerical results confirm the order of convergence as well as the stability, well-balancedness, and asymptotic preservation of the schemes. In particular, for some schemes we observe that the order of convergence is almost independent of the Froude number.
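The implicit-explicit splitting idea can be shown in miniature on a single stiff ODE: the stiff relaxation term is treated implicitly and the remaining term explicitly, so the time step is not restricted by the stiffness parameter a. This is a first-order illustration of the IMEX principle only, not the finite-volume SWE schemes of the thesis:

```python
# First-order IMEX Euler for y' = -a*(y - cos t) - sin t, whose exact
# solution with y(0) = 1 is y(t) = cos t. The stiff term -a*(y - cos t)
# is treated implicitly, the non-stiff term -sin t explicitly, so the
# scheme stays stable even when h*a >> 1.

import math

def imex_euler(a=1000.0, h=1e-2, t_end=1.0):
    y, t = 1.0, 0.0
    for _ in range(round(t_end / h)):
        t_new = t + h
        # solve y_new = y + h*(-a*(y_new - cos t_new)) + h*(-sin t) for y_new
        y = (y + h * (a * math.cos(t_new) - math.sin(t))) / (1.0 + h * a)
        t = t_new
    return y

if __name__ == "__main__":
    y = imex_euler()
    print(abs(y - math.cos(1.0)))  # small error although h*a = 10 >> 1
```

A fully explicit Euler step would require h < 2/a for stability; the IMEX step removes that restriction, mirroring how the thesis removes the gravity-wave CFL restriction while keeping the advective one.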

Abstract:

Nitric oxide (NO) and nitrogen dioxide (NO2) play an important role in the self-cleansing capacity of the atmosphere. These trace gases govern the photochemical production of ozone (O3) and influence the abundance of the hydroxyl (OH) and nitrate (NO3) radicals. When sufficient solar radiation and ozone are present during the day, NO and NO2 are in a fast photochemical equilibrium, the "photostationary state", and their sum is therefore referred to as NOx. Previous studies of the photostationary state of NOx comprise measurements at a wide variety of sites, ranging from cities (characterized by heavy air pollution) to remote regions (characterized by low levels of pollution). While the photochemical cycling of NO and NO2 is fundamentally understood under conditions of elevated NOx concentrations, there are significant gaps in the understanding of the underlying cycling processes in rural and remote regions characterized by lower NOx concentrations. These gaps could be caused by instrumental NO2 interferences, in particular for indirect detection methods, which can be affected by artefacts. At very low NOx concentrations, and when instrumental NO2 interferences can be ruled out, it is frequently concluded that these gaps in understanding are linked to the existence of an "unknown oxidant". In this work the photostationary state of NOx is analysed with the aim of investigating the potential existence of hitherto unknown processes. A gas analyser for the direct measurement of atmospheric NO2 by laser-induced fluorescence (LIF), GANDALF, was newly developed and deployed for field measurements for the first time during the PARADE 2011 campaign. The PARADE measurements were carried out in summer 2011 in a rural area in Germany. Extensive NO2 measurements using different techniques (DOAS, CLD, and CRD) enabled a thorough and successful intercomparison of GANDALF with the other NO2 measurement techniques. Further relevant trace gases and meteorological parameters were measured in order to study the photostationary state of NOx in this environment, based on the NO2 measurements with GANDALF. During PARADE, moderate NOx mixing ratios (10^2-10^4 pptv) were observed at the site. Mixing ratios of biogenic volatile organic compounds (BVOC) from the surrounding, mainly coniferous, forest were of the order of 10^2 pptv. The characteristics of the photostationary state of NOx at low NOx mixing ratios (10-10^3 pptv) were investigated at a second site, in a boreal forest, during the HUMPPA-COPEC 2010 campaign, carried out in summer 2010 at the SMEAR II station in Hyytiälä, southern Finland. The characteristics of the photostationary state of NOx in the two forest environments are compared in this work. Furthermore, the extensive data set, which includes measurements of the trace gases relevant for radical chemistry (OH, HO2) as well as the total OH reactivity, makes it possible to test and improve the current understanding of NOx photochemistry using a box model constrained by the measured data. Although NOx concentrations during HUMPPA-COPEC 2010 were lower than during PARADE 2011, and BVOC concentrations were higher, the cycling of NO and NO2 is fundamentally understood in both cases. The analysis of the photostationary state of NOx for the two markedly different sites shows that potentially unknown processes are not present in either case. The current representation of NOx chemistry was simulated for HUMPPA-COPEC 2010 using the chemical mechanism MIM3*; the simulation results are consistent with the calculations based on the photostationary state of NOx.
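The photostationary state analysed in this work balances NO2 photolysis against the NO + O3 reaction, j(NO2)·[NO2] = k·[NO]·[O3]; the Leighton ratio φ = j(NO2)·[NO2] / (k·[NO]·[O3]) equals 1 when no additional oxidants convert NO to NO2. A minimal calculation with typical daytime values, chosen for illustration and not taken from the campaigns:

```python
# Leighton ratio check of the photostationary state (PSS):
# phi = 1 means NO2 photolysis and the NO + O3 reaction are in balance;
# phi > 1 suggests an additional NO -> NO2 conversion pathway.

def leighton_ratio(j_no2, no2, k_no_o3, no, o3):
    return (j_no2 * no2) / (k_no_o3 * no * o3)

if __name__ == "__main__":
    j = 8e-3        # NO2 photolysis frequency, s^-1 (typical midday value)
    k = 1.8e-14     # NO + O3 rate constant, cm^3 molecule^-1 s^-1 (approx., 298 K)
    o3 = 1.0e12     # O3, molecules cm^-3 (roughly 40 ppbv at the surface)
    no = 2.5e9      # NO, molecules cm^-3
    no2_pss = k * no * o3 / j          # NO2 implied by the PSS
    print(round(leighton_ratio(j, no2_pss, k, no, o3), 3))  # -> 1.0
```

In field analyses like those above, a measured φ systematically greater than 1 at low NOx is exactly the signature that has been attributed either to measurement artefacts or to an "unknown oxidant".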

Abstract:

The Scilla rock avalanche occurred on 6 February 1783 along the coast of the Calabria region (southern Italy), close to the Messina Strait. It was triggered by a mainshock of the Terremoto delle Calabrie seismic sequence, and it induced a tsunami wave responsible for more than 1500 casualties along the neighbouring Marina Grande beach. The main goal of this work is the application of semi-analytical and numerical models to simulate this event. The first is a MATLAB code expressly created for this work that solves the equations of motion for sliding particles on a two-dimensional surface with a fourth-order Runge-Kutta method. The second is a code developed by the Tsunami Research Team of the Department of Physics and Astronomy (DIFA) of the University of Bologna that describes a slide as a chain of blocks able to interact while sliding down a slope, adopting a Lagrangian point of view. A broad description of landslide phenomena, and in particular of landslides induced by earthquakes and with tsunamigenic potential, is proposed in the first part of the work. Subsequently, the physical and mathematical background is presented; in particular, a detailed study of derivative discretization is provided. Later on, the dynamics of a point mass sliding on a surface is described, together with several applications of numerical and analytical models over ideal topographies. In the last part, the dynamics of points sliding on a surface and interacting with each other is presented, and, similarly, different applications on an ideal topography are shown. Finally, the applications to the 1783 Scilla event are shown and discussed.

Abstract:

In this paper, we use morphological and numerical methods to test the hypothesis that seasonally formed fracture patterns in the Martian polar regions result from the brittle failure of seasonal CO2 slab ice. Observations of the polar regions of Mars by the High Resolution Imaging Science Experiment (HiRISE) show very narrow, dark, elongated linear patterns that appear during some periods in spring, disappear in summer and re-appear in the following spring. They are repeatedly formed in the same areas, but they do not repeat the exact pattern from year to year. This leads to the conclusion that they are cracks formed in the seasonal ice layer. Some models of seasonal surface processes rely on the existence of a transparent form of CO2 ice, so-called slab ice. For the observed cracks to form, the ice must be a continuous medium, not an agglomeration of relatively separate particles like a firn. The best explanation for our observations is a slab ice with relatively high transparency in the visible wavelength range. This transparency allows a solid-state greenhouse effect to act underneath the ice sheet, raising the pressure by sublimation from below. The trapped gas creates overpressure, and at some point the ice sheet breaks, creating the observed cracks. We show that the times when the cracks appear agree with the model calculation, providing one more piece of evidence that CO2 slab ice covers the polar areas in spring.
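The pressure build-up invoked above can be put in back-of-the-envelope form: CO2 sublimating beneath a sealed slab accumulates in a thin gas gap, and the ideal gas law gives the overpressure growth rate and hence a time scale for cracking. All numbers below (sublimation flux, gap height, failure overpressure) are illustrative assumptions, not values from the paper:

```python
# Back-of-the-envelope overpressure build-up under a sealed CO2 slab.
# Ideal gas in a gap of fixed thickness: p = (m/V) * R_CO2 * T, so the
# pressure rises linearly with the sublimated mass per unit area.

R_CO2 = 188.9   # specific gas constant of CO2, J kg^-1 K^-1

def time_to_failure(flux, gap, temp, dp_fail):
    """Seconds until the basal overpressure reaches dp_fail.

    flux    : sublimation mass flux into the gap, kg m^-2 s^-1
    gap     : gas gap thickness under the slab, m
    temp    : gas temperature, K
    dp_fail : overpressure at which the slab cracks, Pa
    """
    dp_dt = flux * R_CO2 * temp / gap   # Pa/s
    return dp_fail / dp_dt

if __name__ == "__main__":
    t = time_to_failure(flux=1e-6, gap=0.01, temp=150.0, dp_fail=1e4)
    print(f"{t / 3600:.1f} hours")      # time scale for crack formation
```

With these hypothetical inputs the failure time comes out on the order of an hour, short compared to a Martian spring day, which is consistent with cracking being feasible once the slab seals the gas in.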

Abstract:

The aim of this in vitro study was to assess the agreement among four techniques used as gold standards for the validation of methods for occlusal caries detection. Sixty-five human permanent molars were selected, and one site on each occlusal surface was chosen as the test site. The teeth were cut and prepared according to each technique: stereomicroscopy without coloring (1), dye enhancement with rhodamine B (2) and with fuchsine/acetic light green (3), and semi-quantitative microradiography (4). Digital photographs of each prepared tooth were assessed by three examiners for caries extension. Weighted kappa statistics, as well as Friedman's test with multiple comparisons, were used to compare all techniques and verify statistically significant differences. Results: kappa values varied from 0.62 to 0.78, the latter being found for both dye enhancement methods. Friedman's test showed a statistically significant difference (P < 0.001), and multiple comparisons identified differences among all techniques except between the two dye enhancement methods (rhodamine B and fuchsine/acetic light green). Cross-tabulation showed that stereomicroscopy overscored the lesions, while the two dye enhancement methods showed good agreement with each other. Furthermore, the outcome of caries diagnostic tests may be influenced by the validation method applied. Dye enhancement methods seem to be reliable as gold standard methods.
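Weighted kappa, used above to compare the techniques, generalizes Cohen's kappa by crediting partial agreement between nearby score categories. A minimal implementation with linear weights follows (the paper does not state its weighting scheme, so the linear weights here are an assumption, and the scores are invented):

```python
# Linearly weighted kappa for two raters scoring the same items on an
# ordinal scale 0..m-1. Weight w[i][j] = 1 - |i-j|/(m-1) gives full
# credit for exact agreement and partial credit for near misses.

def weighted_kappa(rater_a, rater_b, n_categories):
    n, m = len(rater_a), n_categories
    w = [[1.0 - abs(i - j) / (m - 1) for j in range(m)] for i in range(m)]
    obs = [[0.0] * m for _ in range(m)]          # joint score proportions
    for a, b in zip(rater_a, rater_b):
        obs[a][b] += 1.0 / n
    pa = [sum(obs[i][j] for j in range(m)) for i in range(m)]  # marginals
    pb = [sum(obs[i][j] for i in range(m)) for j in range(m)]
    po = sum(w[i][j] * obs[i][j] for i in range(m) for j in range(m))
    pe = sum(w[i][j] * pa[i] * pb[j] for i in range(m) for j in range(m))
    return (po - pe) / (1.0 - pe)

if __name__ == "__main__":
    a = [0, 1, 2, 2, 3, 1, 0, 2]   # hypothetical caries-extension scores
    b = [0, 1, 2, 3, 3, 1, 1, 2]
    print(round(weighted_kappa(a, b, 4), 2))  # -> 0.78
```

Perfect agreement yields kappa = 1 and chance-level agreement yields 0, so values such as the 0.62-0.78 reported above indicate substantial but imperfect concordance.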

Abstract:

In Malani and Neilsen (1992) we proposed alternative estimates of the survival function (for time to disease) using a simple marker that describes time to some intermediate stage in a disease process. In this paper we derive the asymptotic variance of one such proposed estimator using two different methods and compare terms of order 1/n when there is no censoring. In the absence of censoring, the asymptotic variance obtained using the Greenwood-type approach converges to the exact variance up to terms involving 1/n. However, the asymptotic variance obtained using counting process theory and results from Voelkel and Crowley (1984) on semi-Markov processes has a different term of order 1/n. It is not clear to us at this point why the variance formulae using the latter approach give different results.
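For context, the Greenwood-type variance referred to above is, for the Kaplan-Meier estimator, Var(Ŝ(t)) = Ŝ(t)² Σ dᵢ/(nᵢ(nᵢ − dᵢ)). Without censoring the sum telescopes and the formula equals the binomial variance Ŝ(1 − Ŝ)/n exactly, which is the sense in which it matches the exact variance. The sketch below, on invented event times, verifies this numerically:

```python
# Kaplan-Meier estimate and Greenwood variance in the uncensored case.
# d_i = deaths at the i-th distinct event time, n_i = number at risk.

def km_greenwood(event_times, t):
    """Return (S_hat(t), Greenwood variance) assuming no censoring."""
    n = len(event_times)
    at_risk, s, gsum = n, 1.0, 0.0
    for u in sorted(set(event_times)):
        if u > t:
            break
        d = event_times.count(u)
        s *= (at_risk - d) / at_risk
        gsum += d / (at_risk * (at_risk - d))
        at_risk -= d
    return s, s * s * gsum

if __name__ == "__main__":
    times = [2, 3, 3, 5, 7, 7, 7, 11]       # hypothetical event times
    s, var = km_greenwood(times, 6)
    n = len(times)
    # Greenwood variance equals the binomial variance S(1-S)/n exactly:
    print(round(s, 3), round(var, 5), round(s * (1 - s) / n, 5))
```

This equality is specific to the uncensored case; the discrepancy discussed in the paper concerns the competing counting-process derivation, not this identity.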

Abstract:

OBJECTIVE: Visual hallucinations are under-reported by patients and often go undiscovered by health professionals. There is no gold standard available to assess hallucinations. Our objective was to develop a reliable, valid, semi-structured interview for identifying and assessing visual hallucinations in older people with eye disease and cognitive impairment. METHODS: We piloted the North-East Visual Hallucinations Interview (NEVHI) in 80 older people with visual and/or cognitive impairment (patient group) and 34 older people without known risks of hallucinations (control group). The informants of 11 patients were interviewed separately. We established face validity, content validity, criterion validity, inter-rater agreement and the internal consistency of the NEVHI, and assessed the factor structure for questions evaluating emotions, cognitions, and behaviours associated with hallucinations. RESULTS: Recurrent visual hallucinations were common in the patient group (68.8%) and absent in controls (0%). The criterion, face and content validities were good, and the internal consistency of the screening questions for hallucinations was high (Cronbach's alpha: 0.71). The inter-rater agreements for simple and complex hallucinations were good (kappa 0.72 and 0.83, respectively). Four factors associated with experiencing hallucinations (perceived control, pleasantness, distress and awareness) were identified and explained a total variance of 73%. Informants gave more 'don't know' answers than patients throughout the interview (p = 0.008), especially to questions evaluating cognitions and emotions associated with hallucinations (p = 0.02). CONCLUSIONS: NEVHI is a comprehensive assessment tool, helpful for identifying the presence of visual hallucinations and for quantifying the cognitions, emotions and behaviours associated with hallucinations.
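Cronbach's alpha, reported above for the screening questions, is computed from the item variances and the variance of the total score: α = k/(k−1) · (1 − Σ var(itemᵢ)/var(total)). A minimal implementation on invented responses:

```python
# Cronbach's alpha for internal consistency of a k-item scale.
# items: list of k lists, one per question, each holding the n responses.

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance

def cronbach_alpha(items):
    k = len(items)
    totals = [sum(col) for col in zip(*items)]    # per-respondent total score
    item_var = sum(variance(it) for it in items)
    return k / (k - 1) * (1.0 - item_var / variance(totals))

if __name__ == "__main__":
    # 3 screening items scored by 5 respondents (hypothetical data)
    items = [[1, 2, 3, 4, 5],
             [2, 2, 3, 5, 4],
             [1, 3, 3, 4, 4]]
    print(round(cronbach_alpha(items), 2))
```

When all items move together alpha approaches 1; values around 0.7, as reported above, are conventionally taken as acceptable internal consistency for a screening instrument.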

Abstract:

OBJECTIVE: To develop a novel application of a tool for semi-automatic volume segmentation and adapt it for the analysis of fetal cardiac cavities and vessels from heart volume datasets. METHODS: We retrospectively studied virtual cardiac volume cycles obtained with spatiotemporal image correlation (STIC) from six fetuses with postnatally confirmed diagnoses: four with normal hearts between 19 and 29 completed gestational weeks, one with d-transposition of the great arteries and one with hypoplastic left heart syndrome. The volumes were analyzed offline using a commercially available segmentation algorithm designed for ovarian folliculometry. Using this software, individual 'cavities' in a static volume are selected and assigned individual colors in cross-sections and in 3D-rendered views, and their dimensions (diameters and volumes) can be calculated. RESULTS: Individual segments of the fetal cardiac cavities could be separated, adjacent segments merged, and the resulting electronic casts studied in their spatial context. Volume measurements could also be performed. Exemplary images and interactive videoclips showing the segmented digital casts were generated. CONCLUSION: The approach presented here is an important step towards an automated fetal volume echocardiogram. It has the potential both to help in obtaining a correct structural diagnosis and to generate exemplary visual displays of cardiac anatomy in normal and structurally abnormal cases for consultation and teaching.
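The cavity-selection step described above can be mimicked in 2D with a simple flood fill: a seed placed inside an echo-poor cavity grows over connected low-intensity pixels, and the region's size stands in for the computed dimensions. This is a generic segmentation sketch, not the commercial algorithm used in the study:

```python
# Seeded region growing (flood fill) on a tiny 2D intensity grid.
# Each selected region corresponds to one "cavity" that could be
# assigned a color and measured.

def flood_fill(image, seed, threshold):
    """Return the set of 4-connected pixels <= threshold reachable from seed."""
    rows, cols = len(image), len(image[0])
    region, stack = set(), [seed]
    while stack:
        r, c = stack.pop()
        if (r, c) in region or not (0 <= r < rows and 0 <= c < cols):
            continue
        if image[r][c] > threshold:
            continue                      # bright "wall" pixel: stop growing
        region.add((r, c))
        stack.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
    return region

if __name__ == "__main__":
    # 0 = echo-poor "cavity", 9 = bright wall (hypothetical cross-section)
    img = [[9, 9, 9, 9, 9],
           [9, 0, 0, 9, 9],
           [9, 0, 0, 9, 0],
           [9, 9, 9, 9, 0]]
    cavity = flood_fill(img, (1, 1), threshold=0)
    print(len(cavity))  # pixels in the selected cavity -> 4
```

Note that the second dark region on the right is not reached from the seed, which is the behaviour that lets adjacent but separate cavities be selected and merged individually.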

Abstract:

Famines are often linked to drought in semi-arid areas of Sub-Saharan Africa where not only pastoralists, but also increasingly agro-pastoralists are affected. This study addresses the interplay between drought and famine in the rural semi-arid areas of Makueni district, Kenya, by examining whether, and how crop production conditions and agro-pastoral strategies predispose smallholder households to drought-triggered food insecurity. If this hypothesis holds, then approaches to deal with drought and famine have to target factors causing household food insecurity during non-drought periods. Data from a longitudinal survey of 127 households, interviews, workshops, and daily rainfall records (1961–2003) were analysed using quantitative and qualitative methods. This integrated approach confirms the above hypothesis and reveals that factors other than rainfall, like asset and labour constraints, inadequate policy enforcement, as well as the poverty-driven inability to adopt risk-averse production systems play a key role. When linking these factors to the high rainfall variability, farmer-relevant definitions and forecasts of drought have to be applied.

Abstract:

Context. Planet formation models have been developed over the past years to try to reproduce the observations of both the solar system and the extrasolar planets. Some of these models have partially succeeded, but they focus on massive planets and, for the sake of simplicity, exclude planets belonging to planetary systems. However, more and more planets are now found in planetary systems. This tendency, which is a result of radial velocity, transit, and direct imaging surveys, seems to be even more pronounced for low-mass planets. These new observations require improving planet formation models, including new physics, and considering the formation of systems. Aims: In a recent series of papers, we have presented some improvements in the physics of our models, focussing in particular on the internal structure of forming planets and on the computation of the excitation state of planetesimals and their resulting accretion rate. In this paper, we focus on the concurrent formation of more than one planet in the same protoplanetary disc and show its effects on the architecture and composition of the resulting systems. Methods: We used an N-body calculation including collision detection to compute the orbital evolution of a planetary system. Moreover, we describe the effect of competition for the accretion of gas and solids, as well as the effect of gravitational interactions between planets. Results: We show that the masses and semi-major axes of planets are modified by both the effect of competition and gravitational interactions. We also present the effect of the assumed number of forming planets in the same system (a free parameter of the model), as well as the effect of inclination and eccentricity damping. We find that the fraction of ejected planets increases from nearly 0 to 8% as we change the number of embryos seeded in the system from 2 to 20.
Moreover, our calculations show that, when considering planets more massive than ~5 M⊕, simulations with 10 or 20 planetary embryos statistically give the same results in terms of mass function and period distribution.
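The machinery described above (N-body integration with collision detection) can be sketched in a minimal 2D form: leapfrog gravity steps plus a merge that conserves mass and momentum whenever two bodies pass within the sum of their radii. Units are arbitrary (G = 1) and all values are illustrative, not from the paper's simulations:

```python
# Minimal 2D N-body integrator with collision detection and merging.

import math

def step(bodies, dt, G=1.0):
    """One leapfrog (kick-drift-kick) step over dicts with m,x,y,vx,vy,r."""
    def kick(h):
        for b in bodies:
            ax = ay = 0.0
            for o in bodies:
                if o is b:
                    continue
                dx, dy = o["x"] - b["x"], o["y"] - b["y"]
                d3 = (dx * dx + dy * dy) ** 1.5
                ax += G * o["m"] * dx / d3
                ay += G * o["m"] * dy / d3
            b["vx"] += h * ax
            b["vy"] += h * ay
    kick(dt / 2)
    for b in bodies:
        b["x"] += dt * b["vx"]
        b["y"] += dt * b["vy"]
    kick(dt / 2)

def merge_collisions(bodies):
    """Replace any touching pair by its mass- and momentum-conserving merger."""
    for i, b in enumerate(bodies):
        for o in bodies[i + 1:]:
            if math.hypot(o["x"] - b["x"], o["y"] - b["y"]) < b["r"] + o["r"]:
                m = b["m"] + o["m"]
                merged = {"m": m, "r": max(b["r"], o["r"]),
                          "x": (b["m"] * b["x"] + o["m"] * o["x"]) / m,
                          "y": (b["m"] * b["y"] + o["m"] * o["y"]) / m,
                          "vx": (b["m"] * b["vx"] + o["m"] * o["vx"]) / m,
                          "vy": (b["m"] * b["vy"] + o["m"] * o["vy"]) / m}
                bodies.remove(b)
                bodies.remove(o)
                bodies.append(merged)
                return merge_collisions(bodies)   # re-check remaining pairs
    return bodies

if __name__ == "__main__":
    # two embryos on a head-on course
    bodies = [{"m": 1.0, "x": -1.0, "y": 0.0, "vx": 0.5, "vy": 0.0, "r": 0.1},
              {"m": 1.0, "x": 1.0, "y": 0.0, "vx": -0.5, "vy": 0.0, "r": 0.1}]
    for _ in range(2000):
        step(bodies, 0.005)
        bodies = merge_collisions(bodies)
        if len(bodies) == 1:
            break
    print(len(bodies), bodies[0]["m"])  # -> 1 2.0
```

A production code would add ejection checks (removing unbound bodies) and a softening or adaptive time step for close encounters; the sketch keeps only the two ingredients named in the abstract, mutual gravity and collisional merging.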