916 results for gap filling


Relevance:

100.00%

Publisher:

Abstract:

This Article analyzes the recognition and enforcement of cross-border insolvency judgments from the United States, United Kingdom, and Australia to determine whether the UNCITRAL Model Law’s goal of modified universalism is currently being practiced, and subjects the Model Law to analysis through the lens of international relations theories to elaborate a way forward. We posit that courts could use the express language of the Model Law text to recognize and enforce foreign insolvency judgments. The adoption of our proposal will reduce costs, maximize recovery for creditors, and ensure predictability for all parties.

Relevance:

100.00%

Publisher:

Abstract:

Long-term measurements of CO2 flux can be obtained using the eddy covariance technique, but these datasets are affected by gaps which hinder the estimation of robust long-term means and annual ecosystem exchanges. We compare results obtained using three gap-filling techniques: multiple regression (MR), multiple imputation (MI), and artificial neural networks (ANNs), applied to a one-year dataset of hourly CO2 flux measurements collected at Lutjewad, over a flat agricultural area near the Wadden Sea dike in the north of the Netherlands. The dataset was separated into two subsets: a learning set and a validation set. The performance of the gap-filling techniques was analysed by calculating statistical criteria: coefficient of determination (R2), root mean square error (RMSE), mean absolute error (MAE), maximum absolute error (MaxAE), and mean square bias (MSB). Gap-filling accuracy is seasonally dependent, with better results in the cold seasons. The highest accuracy is obtained with the ANN technique, which is also less sensitive to environmental and seasonal conditions. We argue that filling gaps directly in the measured CO2 fluxes is more advantageous than the common practice of filling gaps in the calculated net ecosystem exchange, because the ANN is an empirical method and less scatter is expected when gap filling is applied directly to the measurements.
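
As a rough illustration of how these validation criteria can be computed for any candidate filler, a minimal sketch follows (NumPy; the function and array names, and the exact MSB definition used, are assumptions rather than details taken from the study):

    import numpy as np

    def gap_fill_scores(observed, predicted):
        # Validation metrics for a gap-filling method, computed on the subset
        # of records held out for validation. Array names are illustrative.
        residuals = predicted - observed
        ss_res = np.sum(residuals ** 2)
        ss_tot = np.sum((observed - observed.mean()) ** 2)
        return {
            "R2": 1.0 - ss_res / ss_tot,               # coefficient of determination
            "RMSE": np.sqrt(np.mean(residuals ** 2)),  # root mean square error
            "MAE": np.mean(np.abs(residuals)),         # mean absolute error
            "MaxAE": np.max(np.abs(residuals)),        # maximum absolute error
            "MSB": np.mean(residuals) ** 2,            # mean square bias (squared mean residual)
        }

Each technique (MR, MI, ANN) would be scored on the same held-out validation subset and compared on these criteria.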

Relevance:

70.00%

Publisher:

Abstract:

The primary goal of systems biology is to integrate complex omics data with data obtained from traditional experimental studies in order to provide a holistic understanding of organismal function. One way of achieving this aim is to generate genome-scale metabolic models (GEMs), which contain information on all metabolites, enzyme-coding genes, and biochemical reactions in a biological system. A Drosophila melanogaster GEM has not been reconstructed to date. A constraint-free genome-wide metabolic model of the fruit fly has been reconstructed in our lab, identifying gaps where no enzyme was identified and metabolites were either only produced or only consumed. The main focus of the work presented in this thesis was to develop a pipeline for efficient gap filling using metabolomics approaches combined with standard reverse genetics methods, using 5-hydroxyisourate hydrolase (5-HIUH) as an example. 5-HIUH plays a role in the urate degradation pathway; the inability to degrade urate can lead to inborn errors of metabolism (IEMs) in humans, including hyperuricemia. Based on sequence analysis, the Drosophila gene CG30016 was hypothesised to encode 5-HIUH. CG30016 knockout flies were examined to identify a Malpighian tubule phenotype, and their shortened lifespan might reflect the kidney disorders associated with hyperuricemia in humans. Moreover, LC-MS analysis of mutant tubules revealed that CG30016 is involved in purine metabolism, and specifically in the urate degradation pathway. However, the exact role of the gene has not been identified, and the complete method for gap filling has not been developed. Nevertheless, thanks to the work presented here, we are a step closer to a gap-filling pipeline for the Drosophila melanogaster GEM. Importantly, the areas that require further optimisation were identified and are the focus of future research. Moreover, LC-MS analysis confirmed that tubules, rather than the whole fly, were more suitable for metabolomics analysis of purine metabolism. Previously, the Dow/Davies lab generated the most complete tissue-specific transcriptomic atlas for Drosophila, FlyAtlas.org, which provides data on gene expression across multiple tissues of the adult fly and larva. FlyAtlas revealed that transcripts of many genes are enriched in specific Drosophila tissues, and that it is possible to deduce the functions of individual tissues within the fly. Based on FlyAtlas data, it has become clear that the fly (like other metazoan species) must be considered as a set of tissues, each with its own distinct transcriptional and functional profile. Moreover, it revealed that for about 30% of the genome, reverse genetic methods (i.e. mutation of an unknown gene followed by observation of phenotype) are only useful if specific tissues are investigated. Based on the FlyAtlas findings, we aimed to build a primary tissue-specific metabolome of the fruit fly, in order to establish whether different Drosophila tissues have different metabolomes and whether these correspond to the tissue-specific transcriptome of the fruit fly (FlyAtlas.org). Different fly tissues were dissected and their metabolomes elucidated using LC-MS. The results confirmed that tissue metabolomes differ significantly from each other and from the whole fly, and that some of these differences can be correlated with tissue function. The results illustrate the need to study individual tissues as well as the whole organism. It is clear that some metabolites that play an important role in a given tissue might not be detected in a whole-fly sample, because their abundance is much lower than that of metabolites present in all tissues, which prevents detection of the tissue-specific compound.
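
As a side note on the gap-identification step mentioned above, dead-end metabolites in a draft reconstruction can be flagged mechanically. A minimal sketch follows (the reaction entries are hypothetical placeholders, not part of the thesis pipeline):

    # Flag "gaps" in a draft metabolic reconstruction: metabolites that are only
    # ever produced or only ever consumed across the listed reactions.
    reactions = {
        # reaction id: (substrates, products) -- hypothetical example entries
        "R_urate_oxidase": (["urate"], ["5-hydroxyisourate"]),
        "R_5HIUH": (["5-hydroxyisourate"], ["OHCU"]),
    }

    produced, consumed = set(), set()
    for substrates, products in reactions.values():
        consumed.update(substrates)
        produced.update(products)

    print("only produced (missing downstream step?):", sorted(produced - consumed))
    print("only consumed (missing upstream step?):", sorted(consumed - produced))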

Relevance:

60.00%

Publisher:

Abstract:

We measured the net ecosystem CO2 exchange (NEE) in an alpine meadow ecosystem (latitude 37°29'-45'N, longitude 101°12'-23'E, 3250 m above sea level) on the Qinghai-Tibetan Plateau throughout 2002 by the eddy covariance method to examine the carbon dynamics and budget on this unique plateau. Diurnal changes in gross primary production (GPP) and ecosystem respiration (Re) showed that an afternoon increase of NEE was highly associated with an increase of Re. Seasonal changes in GPP corresponded well to changes in the leaf area index and daily photosynthetic photon flux density. The GPP/Re ratio was high and reached about 2.0 during the peak growing season, which indicates that mainly autotrophic respiration controlled the carbon dynamics of the ecosystem. Seasonal changes in mean GPP and Re showed compensatory behavior as reported for temperate and Mediterranean ecosystems, but the seasonal changes in maximum GPP and maximum Re were poorly synchronized. The alpine ecosystem exhibited lower GPP (575 g C m^-2 y^-1) than, but net ecosystem production (78.5 g C m^-2 y^-1) similar to, that of subalpine forest ecosystems. The results suggest that the alpine meadow behaved as a CO2 sink during the 1-year measurement period but apparently sequestered a rather small amount of C in comparison with similar alpine ecosystems.
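
For orientation, the annual fluxes quoted above are linked by the usual partitioning identity; under the common sign convention NEP = GPP - Re, they imply (an inference from the abstract's figures, not a value stated in it):

    R_e \approx \mathrm{GPP} - \mathrm{NEP} \approx 575 - 78.5 \approx 496.5\ \mathrm{g\,C\,m^{-2}\,y^{-1}}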

Relevance:

60.00%

Publisher:

Abstract:

The dissertation addresses the problems of signal reconstruction and data restoration in seismic data processing, taking signal representation methods as the main thread and seismic information reconstruction (signal separation and trace interpolation) as the core. For representation on natural bases, I present the fundamentals and algorithms of independent component analysis (ICA) and its original applications to the separation of natural earthquake signals and of exploration seismic signals. For representation on deterministic bases, the thesis proposes least-squares inversion regularization methods for seismic data reconstruction, sparseness constraints, preconditioned conjugate gradient (PCG) methods, and their applications to seismic deconvolution, Radon transformation, and related problems. The core content is a de-aliased reconstruction algorithm for unevenly sampled seismic data and its application to seismic interpolation. Although the dissertation discusses two cases of signal representation, they can be integrated into one framework, because both deal with the restoration of signals or information: the former reconstructs original signals from mixed signals, the latter reconstructs complete data from sparse or irregular data. Their common goal is to provide pre-processing and post-processing methods for seismic pre-stack depth migration. ICA can separate original signals from their mixtures, or extract the basic structure from the analysed data. I survey the fundamentals, algorithms and applications of ICA and, in comparison with the KL transformation, propose the concept of an independent component transformation (ICT). Using a negentropy measure of independence, I implemented FastICA and improved it by means of the covariance matrix. After analysing the characteristics of seismic signals, I introduced ICA into seismic signal processing for the first time in the geophysical community and implemented the separation of noise from seismic signals. Synthetic and real data examples show that ICA is usable for seismic signal processing, and initial results were achieved. ICA was also applied to separating converted earthquake waves from multiples in a sedimentary area, with good results, allowing a more reasonable interpretation of subsurface discontinuities. The results show the prospects of applying ICA to geophysical signal processing. By virtue of the relationship between ICA and blind deconvolution, I survey seismic blind deconvolution and discuss the prospects of applying ICA to it, with two possible solutions. The relationship among PCA, ICA and the wavelet transform is stated, and it is shown that the reconstruction of wavelet prototype functions is a Lie group representation. In addition, an over-sampled wavelet transform is proposed to enhance seismic data resolution, and is validated by numerical examples. The key to pre-stack depth migration is the regularization of pre-stack seismic data, for which seismic interpolation and missing-data reconstruction are necessary procedures. First, I review seismic imaging methods in order to argue the critical role of regularization. A review of seismic interpolation algorithms shows that de-aliased reconstruction of unevenly sampled data is still a challenge. The fundamentals of seismic reconstruction are discussed first; then sparseness constraints on least-squares inversion and a preconditioned conjugate gradient solver are studied and implemented. Choosing a constraint term with a Cauchy distribution, I programmed the PCG algorithm and implemented sparse seismic deconvolution and high-resolution Radon transformation by PCG, in preparation for seismic data reconstruction. As for seismic interpolation, de-aliased interpolation of evenly sampled data and reconstruction of unevenly sampled data each work well separately, but they cannot be combined with each other. In this thesis, a novel Fourier-transform-based method and algorithm are proposed that can reconstruct seismic data that are both unevenly sampled and aliased. I formulate band-limited data reconstruction as a minimum-norm least-squares inversion problem in which an adaptive DFT-weighted norm regularization term is used. The inverse problem is solved by the preconditioned conjugate gradient method, which makes the solutions stable and quickly convergent. Based on the assumption that seismic data consist of a finite number of linear events, and following from the sampling theorem, aliased events can be attenuated via least-squares weights predicted linearly from the low frequencies. Three application issues are discussed: interpolation of evenly spaced gaps in traces, filling of uneven gaps, and reconstruction of high-frequency traces from low-frequency data constrained by a few high-frequency traces. Both synthetic and real data examples show that the proposed method is valid, efficient and applicable. The research is valuable for seismic data regularization and cross-well seismics. To meet the data requirements of 3D shot-profile depth migration, schemes must be adopted to make the data evenly sampled and consistent with the velocity dataset. The methods of this thesis are used to interpolate and extrapolate the shot gathers instead of simply embedding zero traces, so the migration aperture is enlarged and the migration result is improved. The results show the effectiveness and practicability of the approach.
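
To make the formulation above concrete, the following minimal sketch poses band-limited reconstruction of an irregularly sampled trace as a DFT-weighted minimum-norm least-squares problem and solves the normal equations with a plain conjugate gradient loop. It is an illustrative toy (dense matrices, invented test signal, assumed weights), not the dissertation's algorithm:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 128                                    # regular output grid length
    t = np.arange(n)
    true = np.cos(2 * np.pi * 3 * t / n) + 0.5 * np.sin(2 * np.pi * 7 * t / n)

    keep = np.sort(rng.choice(n, size=48, replace=False))  # irregular sample positions
    d = true[keep]                                          # observed (gappy) data

    F = np.fft.fft(np.eye(n)) / np.sqrt(n)     # unitary DFT matrix
    low = np.abs(np.fft.fftfreq(n)) < 0.1      # assumed signal band
    w = np.where(low, 1.0, 10.0)               # heavier penalty outside the band
    S = np.eye(n)[keep]                        # sampling (selection) operator
    eps = 1e-2

    # Normal equations of  min ||S m - d||^2 + eps ||diag(w) F m||^2
    A = S.T @ S + eps * (F.conj().T @ np.diag(w ** 2) @ F)
    b = S.T @ d

    # Plain conjugate gradient on A m = b (A is Hermitian positive definite).
    m = np.zeros(n, dtype=complex)
    r = b - A @ m
    p = r.copy()
    for _ in range(200):
        Ap = A @ p
        alpha = (r.conj() @ r) / (p.conj() @ Ap)
        m = m + alpha * p
        r_new = r - alpha * Ap
        if np.linalg.norm(r_new) < 1e-8:
            break
        p = r_new + ((r_new.conj() @ r_new) / (r.conj() @ r)) * p
        r = r_new

    print("max reconstruction error:", np.max(np.abs(m.real - true)))

A practical implementation would replace the dense DFT and sampling matrices with FFT-based operators and add the adaptive frequency weights and preconditioning described above, but the structure of the inversion is the same.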

Relevance:

60.00%

Publisher:

Abstract:

We provide a select overview of tools supporting traditional Jewish learning. We then discuss our own HyperJoseph/HyperIsaac project in instructional hypermedia, whose application is to teaching, teacher training, and self-instruction in given Bible passages. The treatment of two narratives has been developed thus far. The tool enables analysis of the text in several respects: linguistic, narratological, and so on. Moreover, the focality of the Scriptures throughout cultural history makes this domain of application particularly challenging, in that the tool is required to encompass the accretion of receptions in the cultural repertoire, i.e., several layers of textual traditions related to the given core passage, whether hermeneutic (interpretive) or appropriative, thus including "secondary" texts (texts that respond to or derive from the core passage) from realms as disparate as Roman-age and later homiletics, medieval and later commentaries or supercommentaries, literary appropriations, references in the arts and modern scholarship, and so on. In particular, the Midrash (homiletic expansion) is adept at narrative gap filling, so the narratives mushroom at the interstices where the primary text is silent. The genealogy of the project is rooted in Weiss's index of the novelist Agnon's writings, which was eventually upgraded into a hypertextual tool including Agnon's full text and ancillary materials. As those early tools were intended primarily for reference and research support in literary studies, the Agnon hypertext system was initially emulated in the conception of HyperJoseph, which is applied to the Joseph story from Genesis. The transition from a reference tool to an instructional tool then required a thorough reconception from an educational perspective, which led to HyperIsaac, on the sacrifice of Isaac, and to a redesign and upgrade of HyperJoseph patterned after HyperIsaac.

Relevance:

60.00%

Publisher:

Abstract:

The investigation, which employed the action research method (qualitative analysis), was divided into four phases. In phases 1-3 the participants were six double bass students at the Nossa Senhora do Cabo Music School. Pilot exercises in creativity were followed by broader and more ambitious projects. In phase 4 the techniques were tested and amplified during a summer course for twelve double bass students at Santa Cecilia College.

Relevance:

60.00%

Publisher:

Abstract:

The English article system is so complex that it presents many challenges for most non-native learners of English. The main difficulty of Portuguese learners, despite the numerous similarities between the two article systems, is a marked tendency to produce the definite article where native speakers of English would not use it. This article reports the results of a cross-sectional study which examined overproduction of the English definite article by a group of 12 Portuguese EFL learners with at least seven years of English instruction. The prediction is that these learners will exhibit evidence of transferring L1 features to their interlanguage when they overuse the definite article. The data were collected by means of a gap-filling task and a composition. As predicted, the results showed that these learners overused "the" in generic contexts. It is argued that this overuse is directly tied to, and can be explained by, the "transfer to somewhere" and conceptual transfer principles.

Relevance:

60.00%

Publisher:

Abstract:

We describe a model-data fusion (MDF) inter-comparison project (REFLEX), which compared various algorithms for estimating carbon (C) model parameters consistent with both measured carbon fluxes and states and a simple C model. Participants were provided with the model and with both synthetic net ecosystem exchange (NEE) of CO2 and leaf area index (LAI) data, generated from the model with added noise, and observed NEE and LAI data from two eddy covariance sites. Participants endeavoured to estimate model parameters and states consistent with the model for all cases over the two years for which data were provided, and generate predictions for one additional year without observations. Nine participants contributed results using Metropolis algorithms, Kalman filters and a genetic algorithm. For the synthetic data case, parameter estimates compared well with the true values. The results of the analyses indicated that parameters linked directly to gross primary production (GPP) and ecosystem respiration, such as those related to foliage allocation and turnover, or temperature sensitivity of heterotrophic respiration, were best constrained and characterised. Poorly estimated parameters were those related to the allocation to and turnover of fine root/wood pools. Estimates of confidence intervals varied among algorithms, but several algorithms successfully located the true values of annual fluxes from synthetic experiments within relatively narrow 90% confidence intervals, achieving >80% success rate and mean NEE confidence intervals <110 gC m−2 year−1 for the synthetic case. Annual C flux estimates generated by participants generally agreed with gap-filling approaches using half-hourly data. The estimation of ecosystem respiration and GPP through MDF agreed well with outputs from partitioning studies using half-hourly data. Confidence limits on annual NEE increased by an average of 88% in the prediction year compared to the previous year, when data were available. Confidence intervals on annual NEE increased by 30% when observed data were used instead of synthetic data, reflecting and quantifying the addition of model error. Finally, our analyses indicated that incorporating additional constraints, using data on C pools (wood, soil and fine roots) would help to reduce uncertainties for model parameters poorly served by eddy covariance data.
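
As context for the estimation algorithms named above, the following minimal sketch shows a Metropolis-style random-walk search for the parameters of a toy carbon model against noisy synthetic NEE observations. The model, prior, step sizes and data are invented for illustration and are not the REFLEX model or protocol:

    import numpy as np

    rng = np.random.default_rng(1)

    def toy_nee_model(params, drivers):
        # Toy stand-in for a simple C model: NEE = respiration - GPP.
        # params = (base_resp, light_use_eff); drivers = (temperature, light).
        base_resp, lue = params
        temp, light = drivers
        resp = base_resp * np.exp(0.1 * temp)   # crude temperature response
        gpp = lue * light                       # crude light response
        return resp - gpp

    # Synthetic "observations" with added noise, as in the synthetic experiments.
    drivers = (rng.uniform(0, 25, 365), rng.uniform(0, 30, 365))
    true_params = (1.5, 0.05)
    obs_sigma = 0.5
    obs = toy_nee_model(true_params, drivers) + rng.normal(0, obs_sigma, 365)

    def log_likelihood(params):
        resid = obs - toy_nee_model(params, drivers)
        return -0.5 * np.sum((resid / obs_sigma) ** 2)

    # Metropolis random walk over the two parameters (flat prior, positive values).
    current = np.array([1.0, 0.1])
    current_ll = log_likelihood(current)
    chain = []
    for _ in range(20000):
        proposal = current + rng.normal(0, [0.05, 0.002])
        if np.all(proposal > 0):
            prop_ll = log_likelihood(proposal)
            if np.log(rng.uniform()) < prop_ll - current_ll:
                current, current_ll = proposal, prop_ll
        chain.append(current.copy())

    chain = np.array(chain[5000:])              # discard burn-in
    print("posterior means:", chain.mean(axis=0))
    print("90% intervals:", np.percentile(chain, [5, 95], axis=0))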

Relevance:

60.00%

Publisher:

Abstract:

Researchers analyzing spatiotemporal or panel data, which varies both in location and over time, often find that their data has holes or gaps. This thesis explores alternative methods for filling those gaps and also suggests a set of techniques for evaluating those gap-filling methods to determine which works best.
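
One standard way to carry out the evaluation described here is to punch artificial gaps into otherwise complete records and score each candidate method on the held-out values. A minimal sketch follows (pandas; the data and the two candidate fillers are illustrative stand-ins, not the thesis's methods):

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(2)

    # Toy panel: rows = time steps, columns = locations (illustrative data only).
    panel = pd.DataFrame(
        rng.normal(20, 5, size=(365, 6)).cumsum(axis=0) / 20,
        columns=[f"site_{i}" for i in range(6)],
    )

    def evaluate_gap_filler(panel, fill_method, gap_fraction=0.1, seed=0):
        # Mask a random fraction of known values, fill them, and score the fill.
        mask_rng = np.random.default_rng(seed)
        mask = mask_rng.random(panel.shape) < gap_fraction
        gappy = panel.mask(mask)                 # artificial gaps
        filled = fill_method(gappy)
        errors = (filled.values - panel.values)[mask]
        return {"RMSE": float(np.sqrt(np.mean(errors ** 2))),
                "MAE": float(np.mean(np.abs(errors)))}

    # Two simple candidate fillers to compare (stand-ins for real methods).
    by_interpolation = lambda df: df.interpolate(axis=0, limit_direction="both")
    by_column_mean = lambda df: df.fillna(df.mean())

    print(evaluate_gap_filler(panel, by_interpolation))
    print(evaluate_gap_filler(panel, by_column_mean))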

Relevance:

60.00%

Publisher:

Abstract:

Purpose: To investigate parameters related to fluency, reading comprehension and phonological processing (working and short-term memory) and to identify potential correlations between these variables in dyslexia and in the absence of reading difficulties. Method: One hundred and fifteen students from the third to eighth grades of elementary school were grouped into a Control Group (CG) and a Group with Dyslexia (GDys). Reading of words, pseudowords and text (decoding), listening and reading comprehension, and phonological short-term and working memory (repetition of pseudowords and Digit Span) were evaluated. Results: The comparison of the groups showed significant differences in decoding, phonological short-term memory (repetition of pseudowords) and answers to text-connecting (TC) questions on reading comprehension, with the worst performances in the GDys. In this group there were negative correlations between pseudoword repetition and both TC answers and total score in listening comprehension. No correlations were found between working and short-term memory (Digit Span) and parameters of fluency and reading comprehension in dyslexia. For the group without reading complaints, there were positive correlations between some parameters of reading fluency and the repetition of pseudowords, and also between answering literal questions in listening comprehension and the repetition of digits in forward and backward order. There was no correlation with the parameters of reading comprehension. Conclusion: The GDys and CG showed similar performance in listening comprehension and in the understanding of explicit information and gap-filling inference in reading comprehension. Students in the GDys showed worse performance in reading decoding, phonological short-term memory (pseudowords) and inferences that depend on understanding textual cohesion in reading. There were negative correlations between pseudoword repetition and both TC answers and total score in listening comprehension.

Relevance:

60.00%

Publisher:

Abstract:

Graduate Program in Geography - FCT

Relevance:

60.00%

Publisher:

Abstract:

Recent studies in animals have shown pronounced resorption of the buccal bone plate after immediate implantation. The use of flapless surgical procedures prior to the installation of immediate implants, as well as the use of synthetic bone graft in the gaps, represent viable alternatives to minimize buccal bone resorption and to favor osseointegration. The aim of this study was to evaluate the healing of the buccal bone plate following immediate implantation using the flapless approach, and to compare this process between sites in which a synthetic bone graft was or was not inserted into the gap between the implant and the buccal bone plate. Lower bicuspids from 8 dogs were bilaterally extracted without the use of flaps, and 4 implants were installed in the alveoli on each side of the mandible, positioned 2.0 mm from the buccal bone plate (gap). Four groups were devised: 2.0-mm subcrestal implants (3.3 x 8 mm) with bone grafts (SCTG), 2.0-mm subcrestal implants without bone grafts (SCCG), equicrestal implants (3.3 x 10 mm) with bone grafts (ECTG), and equicrestal implants without bone grafts (ECCG). One week following the surgical procedures, metallic prostheses were installed, and within 12 weeks the dogs were sacrificed. The blocks containing the individual implants were turned sideways, and radiographic imaging was obtained to analyze the remodeling of the buccal bone plate. In the analysis of the resulting distance between the implant shoulder and the bone crest, statistically significant differences were found for the SCTG when compared to the ECTG (P = .02) and ECCG (P = .03). For the comparison of mean values of the resulting linear distance between the implant surface and the buccal plate, no statistically significant difference was found among the groups (P > .05). The same result was observed for the presence or absence of tissue formation between the implant surface and the buccal plate. Equicrestally placed implants, in this methodology, presented little or no loss of buccal bone. The subcrestally positioned implants presented loss of buccal bone, even though synthetic bone graft was used. The buccal bone, however, was always coronal to the implant shoulder.