827 results for Multiple-scale processing
Abstract:
Small-scale farmers in the Brazilian Amazon collectively hold tenure over more than 12 million ha of permanent forest reserves, as required by the Forest Code. The trade-off between forest conservation and other land uses entails opportunity costs for them and for the country, which have not been sufficiently studied. We assessed the potential income generated by multiple-use forest management for farmers and compared it to the income potentially derived from six other agricultural land uses. Forest income came from (i) logging, carried out by a logging company in partnership with farmers' associations, and (ii) harvesting the seeds of Carapa guianensis (local name andiroba) for the production of oil. We then compared the income generated by multiple-use forest management with the income from different types of agrarian systems. According to our calculations, the mean annual economic benefits from multiple forest use equal those of the least productive agrarian system, but amount to only 25% of the annual income generated by the most productive system. Although the income generated by logging may be considered low when calculated on an annual basis and compared to incomes generated by agriculture, the one-time payment after logging is substantial (US$5,800 to US$33,508) and could be used to implement more intensive and productive cropping systems, such as planting black pepper. The income from forest management could also be used to establish permanent fields in deforested areas for highly productive annual crops using conservation agriculture techniques. These techniques are alternatives to the traditional land use based on periodic clearing of the forest. Nevertheless, shifting current practices towards more sustainable conservation agriculture will also require technical and legal support from the State to help small farmers apply these alternatives, which aim to integrate forest management into sustainable agricultural production systems.
Abstract:
Quality is a variable concept involving many factors, depending on the consumer market. In meat production, concern with environmental aspects, animal welfare, and the health and safety of workers is increasing. This work studied the effect of controlled-atmosphere stunning of broilers on meat features and on biochemical parameters of stress. Cobb broilers were stunned electrically or by controlled atmosphere with 70% CO2 or with 70% CO2 + 30% argon. After stunning, serum levels of glucose, lactate and corticosterone were compared with those of broilers at rest, immediately before transportation and slaughter and after 12 h of feed withdrawal (control group). At slaughter, the blood volume drained during bleeding did not differ among the stunning methods tested, ranging from 3.3 to 3.4% of bird weight; this finding is important because it demonstrates that death resulted from bleeding rather than from the gas stunning itself. Final pH in breast (6.1 to 6.2) and thigh (6.3 to 6.5) also did not vary among the stunning methods (P > 0.05). Lightness (L = 60.55) and redness (a = +8.94) values showed that breasts from electrically stunned birds were darker and redder (P < 0.05), probably due to changes in blood pressure. Glucose and corticosterone levels did not differ between gas-stunned birds (302.45 to 315.7 mg/dl and 55.71 to 72.49 ng/ml, respectively) and birds at rest (305.95 mg/dl and 50.65 ng/ml) (P > 0.05). These stress indicators were higher (337.65 mg/dl for glucose and 104.13 ng/ml for corticosterone) when electrical stunning was used (P < 0.05). Lactate concentrations were lower (5.4 mmol/l) for birds at rest (P < 0.05) but did not differ among the stunning methods tested (7.3 to 8.1 mmol/l; P > 0.05). These results show that serum glucose may be used as a stress indicator in birds, with the advantage of being a quick and cheap biochemical test. Gas stunning eased bird handling during slaughter, reducing workers' effort and injury hazard as well as the amount of feces and dust in the room. To make this method viable for large-scale processing, equipment adjustments will be necessary to avoid delays in the processing line.
Abstract:
Previous studies indicate that regular physical activity in aging acts as a protective factor against cognitive decline and improves mood states; however, longitudinal studies in this area are lacking. Objective: To observe possible changes in cognition related to physical activity. Methods: This study reassessed, after a one-year period, 31 elderly women divided into two groups, sedentary versus active, using behavioral scales and cognitive tests. Results: Compared to the sedentary group, the active group exhibited significantly better performance in general cognitive function, particularly on tasks of episodic memory and praxis, and also on the mood states scale. The active women also reported higher self-efficacy. Conclusion: Long-term physical activity improved quality of life in these elderly women.
Abstract:
The web services (WS) technology provides a comprehensive solution for representing, discovering, and invoking services in a wide variety of environments, including Service Oriented Architectures (SOA) and grid computing systems. At the core of WS technology lie a number of XML-based standards, such as the Simple Object Access Protocol (SOAP), that have successfully ensured WS extensibility, transparency, and interoperability. Nonetheless, there is an increasing demand to enhance WS performance, which is severely impaired by XML's verbosity. SOAP communications produce considerable network traffic, making them unfit for distributed, loosely coupled, and heterogeneous computing environments such as the open Internet. They also introduce higher latency and processing delays than other technologies, like Java RMI and CORBA. WS research has recently focused on SOAP performance enhancement. Many approaches build on the observation that SOAP message exchange usually involves highly similar messages: those created by the same implementation usually have the same structure, and those sent from a server to multiple clients tend to show similarities in structure and content. Similarity evaluation and differential encoding have thus emerged as SOAP performance enhancement techniques. The main idea is to identify the common parts of SOAP messages and process them only once, avoiding a large amount of overhead. Other approaches investigate nontraditional processor architectures, including micro- and macro-level parallel processing solutions, to further increase the processing rates of SOAP/XML software toolkits. This survey paper provides a concise yet comprehensive review of the research efforts aimed at SOAP performance enhancement. A unified view of the problem is provided, covering almost every phase of SOAP processing: message parsing, serialization, deserialization, compression, multicasting, security evaluation, and data/instruction-level processing.
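A minimal sketch of the differential-encoding idea surveyed above: instead of transmitting a full SOAP message, the sender transmits only its differences from a reference message both endpoints already share. The messages and the use of Python's difflib are illustrative assumptions, not a technique from any specific surveyed paper; production systems use compact binary deltas.

```python
import difflib

def lines(s):
    return s.strip().splitlines(keepends=True)

# Hypothetical reference message shared by sender and receiver.
reference = lines("""\
<soap:Envelope>
  <soap:Body>
    <getQuote>
      <symbol>ACME</symbol>
      <currency>USD</currency>
    </getQuote>
  </soap:Body>
</soap:Envelope>
""")

# A new message with the same structure and mostly identical content,
# as is typical for calls produced by the same implementation.
message = lines("""\
<soap:Envelope>
  <soap:Body>
    <getQuote>
      <symbol>INIT</symbol>
      <currency>USD</currency>
    </getQuote>
  </soap:Body>
</soap:Envelope>
""")

# Sender: encode only the differences against the shared reference.
delta = list(difflib.ndiff(reference, message))

# Receiver: reconstruct the full message from the delta alone.
restored = "".join(difflib.restore(delta, 2))
assert restored == "".join(message)

changed = [d for d in delta if d.startswith(("+", "-"))]
print(f"{len(changed)} changed line(s) out of {len(message)}")
```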
Abstract:
Objectives: To identify people affected by leprosy with impairments after completing multidrug therapy, and to assess their limitations in conducting daily activities by applying the Screening of Activity Limitation and Safety Awareness (SALSA) scale. Methods: A cross-sectional study was performed of all residents of a medium-sized city who were treated for leprosy from 1998 to 2006. A specific questionnaire was applied to obtain general and clinical data, and the SALSA scale was used to assess limitations in activities. Impairments were assessed using the World Health Organization leprosy disability grading system (WHO-DG). Findings: Of the 335 people affected by leprosy treated in the period, 223 (62.1%) were located and interviewed. A total of 51.6% were female, the mean age was 54 years (SD ± 15.72), and 67.9% had up to 6 years of formal education. The borderline form predominated among interviewees (39.9%), and 54.3% suffered from associated diseases, hypertension (29.1%) and diabetes (10.3%) being the most common. Pain was reported by 54.7% of interviewees. By multiple logistic regression analysis, limitations in activities were associated with being female (P < 0.025), family income ≤ 3 minimum wages (P < 0.003), reports of major lesions (P < 0.004), pain (P < 0.001), associated diseases (P < 0.023) and the WHO-DG (P < 0.001). Disabilities, as identified using the WHO-DG, were less common (32%) than limitations in activities as evaluated by the SALSA scale (57.8%). Conclusion: Limitations in activities proved to be common in people affected by leprosy and were associated with low income, being female, reported major lesions, disability, associated disease and pain.
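To make the modeling step concrete, here is a hedged sketch of a multiple logistic regression of activity limitation on a few binary covariates, in the spirit of the analysis above. The data are synthetic stand-ins (the study's variables and effect sizes are not reproduced), and statsmodels is an assumed tool choice.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 223  # same order of magnitude as the interviewed sample

X = np.column_stack([
    rng.integers(0, 2, n),  # female (0/1) -- synthetic
    rng.integers(0, 2, n),  # income <= 3 minimum wages (0/1) -- synthetic
    rng.integers(0, 2, n),  # reports pain (0/1) -- synthetic
])

# Simulate the outcome from an assumed logistic model.
logit = -1.0 + 0.8 * X[:, 0] + 0.9 * X[:, 1] + 1.2 * X[:, 2]
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

# Fit the multiple logistic regression and inspect odds-ratio-scale effects.
model = sm.Logit(y, sm.add_constant(X)).fit(disp=False)
print(model.summary(xname=["const", "female", "low_income", "pain"]))
```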
Abstract:
Metalinguistic skill is the ability to reflect upon language as an object of thought. Among metalinguistic skills, two seem to be associated with reading and spelling: morphological awareness and phonological awareness. Phonological awareness is the ability to reflect upon the phonemes that compose words; morphological awareness is the ability to reflect upon the morphemes that compose words. The latter seems to be particularly important for reading comprehension and contextual reading, since these require syntactic and semantic information beyond phonological information. This study investigates, with a longitudinal design, the relation between these abilities and contextual reading as measured by the Cloze test. The first part of the study explores the relationship between morphological awareness tasks and Cloze scores through simple correlations; in the second part, the specificity of this relationship is examined using multiple regression. The results give some support to the hypothesis that morphological awareness contributes to contextual reading in Brazilian Portuguese independently of phonological awareness.
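The "independent contribution" claim rests on a hierarchical-regression logic: enter phonological awareness first, then check how much additional Cloze variance morphological awareness explains. A minimal sketch with synthetic scores (all numbers are illustrative, not the study's data):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 120
phono = rng.normal(size=n)                     # phonological awareness score
morpho = 0.5 * phono + rng.normal(size=n)      # the two skills correlate
cloze = 0.4 * phono + 0.3 * morpho + rng.normal(size=n)  # contextual reading

# Step 1: phonological awareness alone; step 2: add morphological awareness.
base = sm.OLS(cloze, sm.add_constant(phono)).fit()
full = sm.OLS(cloze, sm.add_constant(np.column_stack([phono, morpho]))).fit()

print(f"R2, phonological only:   {base.rsquared:.3f}")
print(f"R2, adding morphological: {full.rsquared:.3f} "
      f"(independent contribution ~ {full.rsquared - base.rsquared:.3f})")
```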
Abstract:
This paper presents the development of a mathematical model to optimize the management and operation of the Brazilian hydrothermal system. The system consists of a large set of individual hydropower plants and a set of aggregated thermal plants. The plants are interconnected by a transmission network so that energy can be transmitted to centers of consumption throughout the country. The optimization model handles different types of constraints, such as interbasin water transfers, water supply for various purposes, and environmental requirements. Its overall objective is to produce energy to meet the country's demand at minimum cost. The model, called HIDROTERM, integrates a database of basic hydrological and technical information with the optimization model and provides an interface to manage input and output data. The optimization model uses the General Algebraic Modeling System (GAMS) package and can invoke different linear as well as nonlinear programming solvers. The model was applied to the Brazilian hydrothermal system, one of the largest in the world, which is divided into four subsystems with 127 active hydropower plants. Preliminary results under different scenarios of inflow, demand, and installed capacity demonstrate the efficiency and utility of the model. From this and other case studies in Brazil, the results indicate that the methodology is suitable for different applications, such as operation planning, capacity expansion and operational rule studies, and trade-off analysis among multiple water users. DOI: 10.1061/(ASCE)WR.1943-5452.0000149. (C) 2012 American Society of Civil Engineers.
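HIDROTERM itself is built in GAMS, but the core trade-off it optimizes (free yet limited hydro energy versus costly thermal generation under a demand constraint) can be shown with a toy linear program. All plant numbers below are hypothetical, and scipy is an assumed substitute for a GAMS solver:

```python
import numpy as np
from scipy.optimize import linprog

demand = np.array([90.0, 120.0, 100.0])  # demand per period (MW-periods)
thermal_cost = 50.0                      # $/MWh; hydro assumed costless
hydro_budget = 180.0                     # total hydro energy (water) available
hydro_cap, thermal_cap = 80.0, 100.0     # per-period generation limits

# Decision variables: [h1, h2, h3, t1, t2, t3].
c = np.r_[np.zeros(3), thermal_cost * np.ones(3)]      # minimize thermal cost
A_eq = np.hstack([np.eye(3), np.eye(3)])               # h_t + t_t = demand_t
A_ub = np.r_[np.ones(3), np.zeros(3)].reshape(1, -1)   # sum(h_t) <= budget

res = linprog(c, A_ub=A_ub, b_ub=[hydro_budget], A_eq=A_eq, b_eq=demand,
              bounds=[(0, hydro_cap)] * 3 + [(0, thermal_cap)] * 3)
print("hydro:", res.x[:3], "thermal:", res.x[3:], "cost:", res.fun)
```

The solver spreads the scarce hydro energy across periods so that the expensive thermal plants run as little as possible, which is the same economic logic the full model applies at national scale.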
Abstract:
Background: Proteinaceous toxins are observed across all levels of inter-organismal and intra-genomic conflicts. These include recently discovered prokaryotic polymorphic toxin systems implicated in intra-specific conflicts. They are characterized by a remarkable diversity of C-terminal toxin domains generated by recombination with standalone toxin-coding cassettes. Prior analysis revealed a striking diversity of nuclease and deaminase domains among the toxin modules. We systematically investigated polymorphic toxin systems using comparative genomics, sequence and structure analysis. Results: Polymorphic toxin systems are distributed across all major bacterial lineages and are delivered by at least eight distinct secretory systems. In addition to the type-II system, these include the type-V, -VI and -VII (ESX) systems, as well as the poorly characterized "Photorhabdus virulence cassettes (PVC)", PrsW-dependent and MuF phage-capsid-like systems. We present evidence that trafficking of these toxins is often accompanied by autoproteolytic processing catalyzed by HINT, ZU5, PrsW, caspase-like and papain-like domains, and by a novel metallopeptidase associated with the PVC system. We identified over 150 distinct toxin domains in these systems. These span an extraordinary catalytic spectrum, including 23 distinct clades of peptidases, numerous previously unrecognized versions of nucleases and deaminases, ADP-ribosyltransferases, ADP ribosyl cyclases, RelA/SpoT-like nucleotidyltransferases, glycosyltransferases and other enzymes predicted to modify lipids and carbohydrates, and a pore-forming toxin domain. Several of these toxin domains are shared with host-directed effectors of pathogenic bacteria. Over 90 families of immunity proteins might neutralize anywhere from a single type to at least 27 distinct types of toxin domains. In some organisms, multiple tandem immunity genes or immunity protein domains are organized into polyimmunity loci or polyimmunity proteins. Gene-neighborhood analysis of polymorphic toxin systems predicts the presence of novel trafficking-related components, and also the organizational logic that allows toxin diversification through recombination. Domain architecture and protein-length analysis revealed that these toxins might be deployed as secreted factors, through directed injection, or via inter-cellular contact facilitated by filamentous structures formed by RHS/YD, filamentous hemagglutinin and other repeats. Phyletic pattern and lifestyle analysis indicate that polymorphic toxins and polyimmunity loci participate in cooperative behavior and facultative 'cheating' in several ecosystems, such as the human oral cavity and soil. Multiple domains from these systems have also been repeatedly transferred to eukaryotes and their viruses, such as the nucleo-cytoplasmic large DNA viruses. Conclusions: Along with a comprehensive inventory of toxins and immunity proteins, we present several testable predictions regarding active sites and catalytic mechanisms of toxins, their processing and trafficking, and their role in intra-specific and inter-specific interactions between bacteria. These systems provide insights regarding the emergence of key systems at different points in eukaryotic evolution, such as ADP ribosylation, interaction of myosin VI with cargo proteins, mediation of apoptosis, hyphal heteroincompatibility, hedgehog signaling, arthropod toxins, cell-cell interaction molecules like teneurins, and different signaling messengers.
Abstract:
We provide a detailed account of the spatial structure of the Brazilian sardine (Sardinella brasiliensis) spawning and nursery habitats, using ichthyoplankton data from nine surveys (1976-1993) covering the Southeastern Brazilian Bight (SBB). The spatial variability of sardine eggs and larvae was partitioned into predefined spatial-scale classes (broad scale, 200-500 km; medium scale, 50-100 km; and local scale, <50 km). The relationship between density distributions at both developmental stages and environmental descriptors (temperature and salinity) was also explored within these spatial scales. Spatial distributions of sardine eggs were mostly structured on medium and local scales, while larvae were characterized by broad- and medium-scale distributions. Broad- and medium-scale surface temperatures were positively correlated with sardine densities for both developmental stages. Correlations with salinity were predominantly negative and concentrated on the medium scale. Broad-scale structuring might be explained by mesoscale processes, such as pulsing upwelling events and Brazil Current meandering at the northern portion of the SBB, while medium-scale relationships may be associated with local estuarine outflows. The results indicate that processes favouring vertical stability might regulate the spatial extent of suitable spawning and nursery habitats for the Brazilian sardine.
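One simple way to realize such a scale partition is to split a density transect into broad-, medium- and local-scale components with moving-average filters of matching widths. The sketch below does this on a synthetic 1-D transect; the actual study partitions 2-D survey data, and the signal, station spacing and window widths here are invented for illustration.

```python
import numpy as np

def moving_average(x, width):
    """Boxcar smoother; width is given in number of stations."""
    kernel = np.ones(width) / width
    return np.convolve(x, kernel, mode="same")

rng = np.random.default_rng(2)
spacing_km = 10
x = np.arange(0, 1000, spacing_km)              # 1000-km transect of stations
density = (np.sin(2 * np.pi * x / 400)          # broad-scale signal (~400 km)
           + 0.5 * np.sin(2 * np.pi * x / 80)   # medium-scale signal (~80 km)
           + 0.3 * rng.normal(size=x.size))     # local-scale noise (<50 km)

broad = moving_average(density, 30)             # ~300-km window
medium = moving_average(density - broad, 8)     # ~80-km window on the residual
local = density - broad - medium                # what remains is local scale

# Variance attributed to each scale class.
print({name: round(float(np.var(c)), 3)
       for name, c in [("broad", broad), ("medium", medium), ("local", local)]})
```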
Abstract:
Background: Clinical and sociodemographic findings support the view that OCD is heterogeneous, comprising multiple, potentially overlapping and stable symptom dimensions. Previous neuroimaging investigations have correlated different patterns of OCD dimension scores with gray matter (GM) volumes. Despite their relevant contribution, methodological limitations, such as patients' previous medication intake, may have contributed to inconsistent findings. Method: Voxel-based morphometry was used to investigate correlations between regional GM volumes and symptom-dimension severity scores in a sample of 38 treatment-naive OCD patients. Several standardized instruments were applied, including an interview developed specifically for assessing symptom-dimension severity (DY-BOCS). Results: Scores on the "aggression" dimension correlated positively with GM volumes in the lateral parietal cortex in both hemispheres and negatively with GM volumes in the bilateral insula, left putamen and left inferior OFC. Scores on the "sexual/religious" dimension correlated positively with GM volumes within the right middle lateral OFC and right DLPFC and negatively with the bilateral ACC. Scores on the "hoarding" dimension correlated positively with GM volumes in the left superior lateral OFC and negatively in the right parahippocampal gyrus. No significant correlations between GM volumes and the "contamination" or "symmetry" dimensions were found. Conclusions: Building upon preexisting findings, our data from treatment-naive OCD patients demonstrate distinct GM substrates implicated in both cognitive and emotion processing across different OCS dimensions. (C) 2012 Elsevier Ltd. All rights reserved.
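Schematically, the VBM correlation analysis amounts to testing, voxel by voxel, the association between GM volume and a dimension severity score across subjects. The sketch below uses random arrays with one planted effect and plain Pearson tests with Bonferroni correction; real VBM pipelines add covariates, smoothing and cluster-level inference, so this is only the bare statistical skeleton.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n_subjects, n_voxels = 38, 5000                 # 38 patients, toy voxel count
gm = rng.normal(size=(n_subjects, n_voxels))    # synthetic GM volume per voxel
score = rng.normal(size=n_subjects)             # e.g. a DY-BOCS dimension score
gm[:, 0] += 1.2 * score                         # plant one true association

# Voxelwise Pearson correlation between GM volume and severity score.
r = np.array([stats.pearsonr(gm[:, v], score)[0] for v in range(n_voxels)])

# Convert r to t statistics and two-sided p-values, then Bonferroni-correct.
t = r * np.sqrt((n_subjects - 2) / (1 - r ** 2))
p = 2 * stats.t.sf(np.abs(t), df=n_subjects - 2)
print("voxels passing Bonferroni:", np.flatnonzero(p < 0.05 / n_voxels))
```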
Abstract:
The seminal work of Horn and Schunck [8] is the first variational method for optical flow estimation. It introduced a novel framework where the optical flow is computed as the solution of a minimization problem. From the assumption that pixel intensities do not change over time, the optical flow constraint equation is derived; this equation relates the optical flow to the derivatives of the image. There are infinitely many vector fields that satisfy the optical flow constraint, so the problem is ill-posed. To overcome this, Horn and Schunck introduced an additional regularity condition that restricts the possible solutions. Their method minimizes both the optical flow constraint and the magnitude of the variations of the flow field, producing smooth vector fields. One limitation of this method is that, typically, it can only estimate small motions. In the presence of large displacements, it fails when the gradient of the image is not smooth enough. In this work, we describe an implementation of the original Horn and Schunck method and also introduce a multi-scale strategy to deal with larger displacements. For this strategy, we create a pyramidal structure of downsampled images and replace the optical flow constraint equation with a nonlinear formulation. To tackle this nonlinear formula, we linearize it and solve the method iteratively at each scale. Here there are two common approaches: one computes the motion increment in the iterations; the one we follow computes the full flow during the iterations. The solutions are incrementally refined over the scales. This pyramidal structure is a standard tool in many optical flow methods.
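For reference, a single-scale Horn and Schunck solver is only a few lines of numpy; the multi-scale strategy described above runs such a solver at each pyramid level and propagates the flow between levels. This sketch implements the classic Jacobi iteration derived from the Euler-Lagrange equations (parameter values and the test image are illustrative):

```python
import numpy as np
from scipy.ndimage import convolve

def horn_schunck(I1, I2, alpha=1.0, n_iter=300):
    """Single-scale Horn-Schunck optical flow between images I1 and I2."""
    I1, I2 = I1.astype(float), I2.astype(float)
    Iy, Ix = np.gradient(I1)   # spatial derivatives (rows, then columns)
    It = I2 - I1               # temporal derivative
    # Kernel for the neighbourhood average of the flow (smoothness term).
    avg = np.array([[1/12, 1/6, 1/12],
                    [1/6,  0.0, 1/6],
                    [1/12, 1/6, 1/12]])
    u = np.zeros_like(I1)
    v = np.zeros_like(I1)
    for _ in range(n_iter):
        u_bar, v_bar = convolve(u, avg), convolve(v, avg)
        # Jacobi update from the Euler-Lagrange equations.
        common = (Ix * u_bar + Iy * v_bar + It) / (alpha**2 + Ix**2 + Iy**2)
        u = u_bar - Ix * common
        v = v_bar - Iy * common
    return u, v

# One-pixel horizontal shift: a small motion this single scale can recover;
# larger shifts are what the pyramid is for.
rng = np.random.default_rng(0)
img = 255 * convolve(rng.random((64, 64)), np.ones((5, 5)) / 25)
u, v = horn_schunck(img, np.roll(img, 1, axis=1))
print(f"mean flow: u={u.mean():.2f}, v={v.mean():.2f}  (expected ~1, ~0)")
```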
Abstract:
The continuous increase of genome sequencing projects has produced a huge amount of data in the last 10 years: currently more than 600 prokaryotic and 80 eukaryotic genomes are fully sequenced and publicly available. However, the sequencing process alone determines only raw nucleotide sequences. This is just the first step of the genome annotation process, which deals with assigning biological information to each sequence. Annotation is carried out at every level of the biological information-processing mechanism, from DNA to protein, and cannot be accomplished only by in vitro analysis procedures, which would be extremely expensive and time-consuming at such a large scale. Thus, in silico methods are needed to accomplish the task. The aim of this work was the implementation of predictive computational methods to allow fast, reliable and automated annotation of genomes and proteins starting from amino acid sequences. The first part of the work focused on the implementation of a new machine learning based method for predicting the subcellular localization of soluble eukaryotic proteins. The method, called BaCelLo, was developed in 2006. Its main peculiarity is independence from biases in the training dataset, which cause the over-prediction of the most represented examples in all other predictors developed so far. This important result was achieved by a modification I made to the standard Support Vector Machine (SVM) algorithm, creating the so-called Balanced SVM (the balancing idea is sketched after this abstract). BaCelLo predicts the most important subcellular localizations in eukaryotic cells, and three kingdom-specific predictors were implemented. In two extensive comparisons, carried out in 2006 and 2008, BaCelLo outperformed all the state-of-the-art methods available for this prediction task. BaCelLo was subsequently used to annotate five eukaryotic genomes completely, by integrating it into a pipeline of predictors developed at the Bologna Biocomputing group by Dr. Pier Luigi Martelli and Dr. Piero Fariselli. An online database, called eSLDB, was developed by integrating, for each amino acid sequence extracted from the genomes, the predicted subcellular localization merged with experimental and similarity-based annotations. In the second part of the work a new machine learning based method was implemented for the prediction of GPI-anchored proteins. The method efficiently predicts, from the raw amino acid sequence, both the presence of the GPI anchor (by means of an SVM) and the position in the sequence of the post-translational modification event, the so-called ω-site (by means of a Hidden Markov Model, HMM). The method, called GPIPE, greatly improved prediction performance for GPI-anchored proteins over all previously developed methods: it predicted up to 88% of the experimentally annotated GPI-anchored proteins while maintaining a false positive rate as low as 0.1%. GPIPE was used to annotate 81 eukaryotic genomes completely, yielding more than 15,000 putative GPI-anchored proteins, 561 of which are found in H. sapiens. On average, 1% of a proteome is predicted as GPI-anchored. A statistical analysis of the composition of the regions surrounding the ω-site allowed the definition of specific amino acid abundances in the different regions considered.
Furthermore, the hypothesis, proposed in the literature, that compositional biases are present among the four major eukaryotic kingdoms was tested and rejected. All the developed predictors and databases are freely available at: BaCelLo http://gpcr.biocomp.unibo.it/bacello eSLDB http://gpcr.biocomp.unibo.it/esldb GPIPE http://gpcr.biocomp.unibo.it/gpipe
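The class-balancing idea behind BaCelLo's Balanced SVM can be illustrated with scikit-learn's per-class penalty reweighting, which corrects the same failure mode (over-prediction of the majority class). This is an analogous technique on synthetic data, not the author's actual implementation:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import balanced_accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
# Toy imbalanced problem: 900 "majority-class" vs 100 "minority-class"
# examples in a 2-D feature space (stand-ins for localization classes).
X = np.r_[rng.normal(0.0, 1.0, (900, 2)), rng.normal(1.5, 1.0, (100, 2))]
y = np.r_[np.zeros(900), np.ones(100)]
Xtr, Xte, ytr, yte = train_test_split(X, y, stratify=y, random_state=0)

for weight in (None, "balanced"):
    # class_weight="balanced" scales each class's penalty C inversely to
    # its frequency, so the minority class is not ignored by the margin.
    clf = SVC(kernel="rbf", class_weight=weight).fit(Xtr, ytr)
    score = balanced_accuracy_score(yte, clf.predict(Xte))
    print(f"class_weight={weight}: balanced accuracy {score:.2f}")
```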
Abstract:
Managing multiple sources of disturbance in MPAs: the case of the Tremiti Islands. This thesis evaluates the protection effectiveness of a marine protected area (MPA) on the assemblages of different habitats under different protection regimes. The topic matters both scientifically, since MPAs represent a large-scale experiment in the exclusion of human activities, and socio-economically, given the interest they generate in local communities. To date, however, few studies have demonstrated the protection effectiveness of MPAs, and most focus on species of commercial interest. In general, there is an evident lack of protection in many Mediterranean and extra-Mediterranean MPAs, attributable to several factors, including the physical characteristics of the sites where they were established, how they are managed, and the numerous illegal activities carried out within their boundaries. Moreover, even legal activities in protected areas are often inadequately regulated, further limiting the pursuit of the founding objectives and the protection of marine biodiversity. Testing hypotheses on the protection effectiveness of MPAs is therefore fundamental to understanding which types of impact are most present and to providing management bodies with useful information for improving MPA administration. In particular, the MPA of the Tremiti Islands archipelago, established more than twenty years ago, presents many critical issues, as shown by previous monitoring campaigns conducted by the National Inter-University Consortium for Marine Sciences (CoNISMa). In this context, this thesis aimed to quantify the effect of the regulation of different human activities on the assemblages of the subtidal, the fringe and the Posidonia oceanica meadows in the Tremiti Islands MPA, at different spatial scales, over a period of about ten years. This work is also part of a project funded by the Italian Ministry of the Environment (Ministero dell'Ambiente e della Tutela del Territorio e del Mare) and awarded to CoNISMa to set up experimental monitoring and mitigation activities in this MPA. Sampling was conducted between June and September 2010, and the data collected were integrated with those obtained in previous monitoring at the Tremiti Islands. The results showed that: (1) there are significant differences, consistent over time, between the subtidal of Pianosa Island and that of the other islands of the archipelago; (2) the fringe assemblages of Pianosa, San Domino and Caprara show no significant differences; (3) there is high site-scale variability in the Posidonia oceanica meadows, but no difference between protected and impacted locations. The difference found in the subtidal between the fully protected zone (Pianosa) and the other islands of the archipelago (controls) cannot, however, be attributed to a protection effect. In fact, the subtidal of Pianosa is characterized by a very extensive barren with high percentages of encrusting red sponges, encrusting rhodophytes and sea urchins, whereas on the islands of San Domino and Caprara there is greater diversity, with articulated coralline algae, erect algae, hydrozoans, ascidians and numerous sponges.
Several factors may have contributed to this result, but most likely the substantial illegal fishing practiced around Pianosa, combined with grazing by herbivores left unchecked by predators, limits the recovery of the assemblages. By contrast, the absence of differences in the fringe assemblages of the three islands sampled suggests a lack of direct impacts (mainly trampling) on this habitat. As for Posidonia oceanica, the results suggest that indiscriminate anchoring is occurring on all the meadows of the Tremiti Islands and that these meadows are most likely in strong regression, as research conducted by the University of Bari also indicates. Further studies are nevertheless needed to better understand the variability in the response of the assemblages under different environmental conditions and management effort. In conclusion, the results clearly show that the Tremiti Islands MPA, like most Italian MPAs, offers poor protection effectiveness. To resolve the persistent conflicts at the Tremiti Islands, which prevent the MPA from achieving its founding objectives, it is absolutely necessary to enforce the current regulations by increasing the number of coast guards on the island throughout the year and, if need be, to re-zone the MPA and develop a management plan in agreement with a properly informed local population. Only in this way will it be possible to reduce the numerous illegal activities within the MPA and, at the same time, make the citizens themselves an essential component of the conservation of this archipelago.
Abstract:
Array seismology is a useful tool for detailed investigation of the Earth's interior. By exploiting the coherence properties of the wavefield, seismic arrays can extract directivity information and increase the amplitude ratio of coherent signal to incoherent noise. The Double Beam Method (DBM), developed by Krüger et al. (1993, 1996), is one possible application of seismic arrays to refined seismic investigation of the crust and mantle. The DBM is based on a combination of source and receiver arrays, leading to a further improvement of the signal-to-noise ratio and a reduced error in the location of coherent phases. Previous DBM work has addressed mantle and core/mantle resolution (Krüger et al., 1993; Scherbaum et al., 1997; Krüger et al., 2001). Here, an implementation of the DBM is presented at large 2D scale (Italian dataset for the Mw = 9.3 Sumatra earthquake) and at 3D crustal scale, following Rietbrock & Scherbaum (1999) and applying the revised version of the Source Scanning Algorithm (SSA; Kao & Shan, 2004). In the 2D application, the propagation of the rupture front in time was computed. In the 3D application, the study volume (20 x 20 x 33 km3), the dataset and the source-receiver configurations are those of the KTB-1994 seismic experiment (Jost et al., 1998). We used 60 short-period seismic stations (200-Hz sampling rate, 1-Hz sensors) arranged in 9 small arrays deployed in 2 concentric rings of about 1 km (A arrays) and 5 km (B array) radius. Coherence values of the scattering points were computed in the crustal volume over a finite time window across all array stations, given the hypothesized origin time and source location. The resulting images can be seen as a (relative) joint log-likelihood that any point in the subsurface contributed to the full set of observed seismograms.
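The receiver-side half of the DBM is ordinary delay-and-sum beamforming: traces are time-shifted according to a hypothesized slowness and stacked, so the coherent phase adds constructively while incoherent noise cancels; the DBM then stacks such beams again over a source array. A numpy sketch on synthetic traces (the geometry, slowness and wavelet below are invented for illustration):

```python
import numpy as np

def beam(traces, delays, dt):
    """Advance each trace by its delay (in s) via FFT phase shift, then stack."""
    n = traces.shape[1]
    freqs = np.fft.rfftfreq(n, dt)
    spec = np.fft.rfft(traces, axis=1)
    shifted = np.fft.irfft(
        spec * np.exp(2j * np.pi * freqs * delays[:, None]), n, axis=1)
    return shifted.mean(axis=0)

dt, n = 0.005, 2000                        # 200-Hz sampling, 10-s traces
t = np.arange(n) * dt
rng = np.random.default_rng(5)
rec_x = np.linspace(-0.5, 0.5, 9)          # 9 receivers along a line (km)
slowness = 0.2                             # hypothesized slowness (s/km)

def wavelet(t0):
    return np.exp(-((t - t0) / 0.05) ** 2)

# Synthetic gather: the arrival moves out across the array, buried in noise.
traces = np.array([wavelet(4.0 + slowness * x) for x in rec_x])
traces += 0.5 * rng.normal(size=traces.shape)

rb = beam(traces, slowness * rec_x, dt)    # receiver beam aligns the arrival
print(f"beam peak at {t[np.argmax(rb)]:.2f} s (true onset 4.00 s)")
```

Scanning the beam power over candidate slownesses and source positions, as the SSA does over origin times and locations, turns this alignment step into the coherence images described above.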
Abstract:
A year of satellite-borne lidar CALIOP data is analyzed, and statistics on the occurrence and distribution of bulk properties of cirrus clouds are provided. The relationship between environmental and cloud physical parameters and the shape of the backscatter profile (BSP) is investigated. It is found that the CALIOP BSP is mainly affected by cloud geometrical thickness, while only minor impacts can be attributed to other quantities such as optical depth or temperature. Polynomial functions are provided to fit mean BSPs as functions of geometrical thickness and position within the cloud layer. It is demonstrated that, under realistic hypotheses, the mean BSP is linearly proportional to the IWC profile. The IWC parameterization is included in the RT-RET retrieval algorithm, which is used to analyze infrared radiance measurements in the presence of cirrus clouds during the ECOWAR field campaign. Retrieved microphysical and optical properties of the observed cloud are used as input parameters in a forward RT simulation over the 100-1100 cm-1 spectral interval and compared with interferometric data to test the ability of the current single-scattering-properties database for ice crystals to reproduce realistic optical features. Finally, a global-scale investigation of cirrus clouds is performed by developing a collocation algorithm that combines satellite data from multiple sensors (AIRS, CALIOP, MODIS). The resulting dataset is used to test a new infrared hyperspectral retrieval algorithm. Retrieval products are compared with the collocated data; in particular, the cloud top height (CTH) product is considered. The retrieved CTH agrees better with CALIOP than with MODIS, even though some cases of underestimation and overestimation are observed.
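The BSP fitting step described above reduces, in its simplest form, to a least-squares polynomial fit of the mean backscatter profile against normalized position within the cloud layer. A sketch with numpy on a synthetic profile (the functional form, noise level and polynomial degree are invented, not the paper's coefficients):

```python
import numpy as np

rng = np.random.default_rng(6)
z = np.linspace(0.0, 1.0, 50)       # normalized position within the layer
# Mock mean BSP: peaked in the upper part of the layer, plus noise.
bsp = 4 * z * (1 - z) ** 2 + 0.02 * rng.normal(size=z.size)

coeffs = np.polyfit(z, bsp, deg=4)  # least-squares 4th-order polynomial fit
fit = np.polyval(coeffs, z)
rmse = np.sqrt(np.mean((fit - bsp) ** 2))
print(f"polynomial coefficients: {np.round(coeffs, 3)}, RMSE {rmse:.3f}")
```

In the study, one such fit would be produced per geometrical-thickness class, giving a compact parameterization of the mean BSP usable as a proxy for the IWC profile.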