877 results for "Technical difficulties"
Abstract:
A pivotal problem in Bayesian nonparametrics is the construction of prior distributions on the space M(V) of probability measures on a given domain V. In principle, such distributions on the infinite-dimensional space M(V) can be constructed from their finite-dimensional marginals---the most prominent example being the construction of the Dirichlet process from finite-dimensional Dirichlet distributions. This approach is both intuitive and applicable to the construction of arbitrary distributions on M(V), but also hamstrung by a number of technical difficulties. We show how these difficulties can be resolved if the domain V is a Polish topological space, and give a representation theorem directly applicable to the construction of any probability distribution on M(V) whose first moment measure is well-defined. The proof draws on a projective limit theorem of Bochner, and on properties of set functions on Polish spaces to establish countable additivity of the resulting random probabilities.
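For orientation, the finite-dimensional construction this abstract refers to can be made concrete with the standard characterization of the Dirichlet process (a textbook statement, not this paper's theorem): a random probability measure P on V follows DP(alpha) exactly when its values on finite measurable partitions have Dirichlet marginals.

    % Standard finite-dimensional marginals of the Dirichlet process DP(\alpha):
    % for every finite measurable partition (A_1, \dots, A_k) of V,
    P \sim \mathrm{DP}(\alpha)
    \quad\Longleftrightarrow\quad
    \bigl(P(A_1), \dots, P(A_k)\bigr) \sim \mathrm{Dir}\bigl(\alpha(A_1), \dots, \alpha(A_k)\bigr).

The technical difficulties the abstract alludes to arise in passing from such consistent finite-dimensional laws to a countably additive random measure on M(V).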
Abstract:
Although many low-resistivity payzones have been found in petroliferous basins worldwide, no logging-interpretation method has so far been effective enough to identify them, because the resistivity of such a payzone is almost equal to that of an aquifer. After a thorough study of the technical difficulties in logging interpretation for low-resistivity payzones, this paper puts forward corresponding solutions. To reveal the microscopic mechanism, the discovered low-resistivity payzones were investigated through analyses of core and laboratory test data, and the main factors influencing low-resistivity reservoirs were identified, including conductive minerals, clay minerals, fluids, porosity, and pore structure. To clarify the degree of influence of these reservoir factors on resistivity logs, laboratory studies and numerical simulations were carried out on typical core and formation-water samples, and the influence of each factor was ascertained quantitatively or semi-quantitatively. The distribution law and likely distribution areas of low-resistivity payzones in the Jiyang Depression were then determined, after the macroscopic geological origins in the area (sedimentation, dynamic accumulation processes, diagenesis, etc.) had been studied. To resolve the difficulty of logging interpretation, methods of interpretation and identification are proposed according to the payzone type, as classified by the macroscopic geological laws and by the combined features of the logging traces, following a systematic summary of the different logging responses caused by the different microscopic mechanisms. These methods have been applied in the Dongying and Huimin Sags of the Shengli Exploration Area; the identification accuracy for low-resistivity payzones improved markedly, and the good economic returns demonstrate their great potential.
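For orientation, the ambiguity described above can be illustrated with Archie's equation, the classical resistivity-interpretation formula (a generic sketch, not the identification method proposed in this paper): at low true resistivity, a payzone's computed water saturation approaches an aquifer's.

    # Illustrative sketch: water saturation from Archie's equation (classical
    # log-interpretation formula; all parameter values below are invented).
    def archie_sw(rw, rt, phi, a=1.0, m=2.0, n=2.0):
        """Water saturation from formation-water resistivity rw (ohm*m),
        true formation resistivity rt (ohm*m) and porosity phi (fraction)."""
        return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)

    # A low-resistivity payzone reads almost like an aquifer:
    print(archie_sw(rw=0.05, rt=2.0, phi=0.25))   # payzone-like reading, Sw ~ 0.63
    print(archie_sw(rw=0.05, rt=1.2, phi=0.25))   # aquifer-like reading, Sw ~ 0.82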
Abstract:
A time-of-flight (ToF) mass spectrometer suitable, in terms of sensitivity, detector response and time resolution, for application in fast-transient Temporal Analysis of Products (TAP) kinetic catalyst characterization is reported. Technical difficulties associated with such an application, as well as the solutions implemented as adaptations of the ToF apparatus, are discussed. The performance of the ToF was validated, and the full linearity of the detector over its full dynamic range was explored in order to ensure its applicability for TAP experiments. The reported TAP-ToF setup is the first system that achieves a level of sensitivity allowing the full 0-200 AMU range to be monitored simultaneously with sub-millisecond time resolution. In this new setup, the high sensitivity allows the use of low-intensity pulses, ensuring that transport through the reactor occurs in the Knudsen diffusion regime and that the data can therefore be fully analysed using the reported theoretical TAP models and data processing.
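The Knudsen-regime condition mentioned at the end of the abstract can be checked with a kinetic-theory estimate; the sketch below uses illustrative numbers (temperature, pressure, molecule size and reactor dimension are assumptions, not values from the paper).

    import math

    KB = 1.380649e-23   # Boltzmann constant, J/K

    def mean_free_path(temp_k, pressure_pa, molecule_diameter_m):
        """Mean free path: lambda = k*T / (sqrt(2) * pi * d**2 * p)."""
        return KB * temp_k / (math.sqrt(2) * math.pi * molecule_diameter_m ** 2 * pressure_pa)

    lam = mean_free_path(temp_k=300.0, pressure_pa=0.1, molecule_diameter_m=3.7e-10)
    d_reactor = 1e-3                         # assumed characteristic channel size, m
    print(f"Kn = {lam / d_reactor:.0f}")     # Kn >> 1 indicates the Knudsen regime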
Abstract:
Paralytic shellfish poisoning (PSP) toxins are produced by certain marine dinoflagellates and may accumulate in bivalve molluscs through filter feeding. The Mouse Bioassay (MBA) is the internationally recognised reference method of analysis, but it is prone to technical difficulties and regarded with increasing disapproval on ethical grounds. As such, alternative methods are required. A rapid surface plasmon resonance (SPR) biosensor inhibition assay was developed to detect PSP toxins in shellfish by employing a saxitoxin polyclonal antibody (R895). Using an assay developed for and validated on the Biacore Q biosensor system, this project focused on transferring the assay to a high-throughput Biacore T100 biosensor in another laboratory. This was achieved using a prototype PSP toxin kit and recommended assay parameters based on the Biacore Q method. A monoclonal antibody (GT13A) was also assessed. Even though these two instruments are based on SPR principles, they vary widely in their mode of operation, including differences in the integrated microfluidic cartridges, autosampler system, and sensor chip compatibilities. Shellfish samples (n = 60), extracted using a simple, rapid procedure, were analysed on each platform, and results were compared to the AOAC high performance liquid chromatography (HPLC) and MBA methods. The overall agreement between methods, based on statistical 2 × 2 comparison tables, ranged from 85% to 94.4% using R895 and from 77.8% to 100% using GT13A. The results demonstrate that antibody-based assays with high sensitivity and broad specificity to PSP toxins can be applied to different biosensor platforms.
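The overall-agreement figures quoted above come from 2 × 2 comparison tables; a minimal sketch of that computation, with invented counts rather than the study's data, is:

    # Percent overall agreement between two methods from a 2 x 2 table.
    def overall_agreement(both_pos, both_neg, only_a_pos, only_b_pos):
        total = both_pos + both_neg + only_a_pos + only_b_pos
        return 100.0 * (both_pos + both_neg) / total

    # e.g. biosensor vs HPLC on 60 shellfish samples (counts are illustrative)
    print(f"{overall_agreement(both_pos=20, both_neg=34, only_a_pos=4, only_b_pos=2):.1f}%")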
Abstract:
We report the detection of Voigt spectral line profiles of radio recombination lines (RRLs) toward Sagittarius B2(N) with the 100 m Green Bank Telescope (GBT). At radio wavelengths, astronomical spectra are highly populated with RRLs, which serve as ideal probes of the physical conditions in molecular cloud complexes. An analysis of the Hnα lines presented herein shows that RRLs of higher principal quantum number (n > 90) generally diverge from their expected Gaussian profiles and, moreover, are well described by their respective Voigt profiles. This agrees with the theory that spectral lines experience pressure broadening as a result of electron collisions at lower radio frequencies. Given the inherent technical difficulties in detecting and profiling true RRL wing spans and shapes, it is crucial that the observing instrumentation produce flat baselines as well as high-sensitivity, high-resolution data. The GBT has demonstrated its capabilities in all of these respects, and we believe that future observations of RRL emission with the GBT will be crucial to advancing our knowledge of the larger-scale extended structures of ionized gas in the interstellar medium (ISM).
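For reference, a Voigt profile is the convolution of a Gaussian (Doppler) core with a Lorentzian (pressure-broadened) component, and a standard way to evaluate it is through the Faddeeva function; the sketch below is generic, not the authors' fitting pipeline.

    import numpy as np
    from scipy.special import wofz

    def voigt(x, sigma, gamma):
        """Voigt profile: Gaussian of std sigma convolved with a Lorentzian of
        HWHM gamma, evaluated via the Faddeeva function wofz."""
        z = (x + 1j * gamma) / (sigma * np.sqrt(2.0))
        return np.real(wofz(z)) / (sigma * np.sqrt(2.0 * np.pi))

    # The Lorentzian wings dominate far from line centre, which is why
    # high-n RRLs depart from Gaussian shapes in their wings.
    x = np.linspace(-50.0, 50.0, 1001)
    profile = voigt(x, sigma=5.0, gamma=2.0)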
Abstract:
Master's thesis, Education (Didactics of Mathematics), Universidade de Lisboa, Instituto de Educação, 2010
Abstract:
The study of AC losses in superconducting pancake coils is of utmost importance for the development of superconducting devices. Owing to various technical difficulties, this study is usually performed under one of two approaches: superconducting coils with few turns, studied over a large frequency range, or coils with a large number of turns, with AC losses measured only at low frequencies. In this work, AC losses are studied in 128-turn superconducting coils at frequencies ranging from 50 Hz to 1152 Hz and currents ranging from zero to the critical current of the coils. Moreover, AC losses are also studied under two simultaneous harmonic components, and the results are compared to the behaviour of the coils operating in a single-frequency regime. Different electrical methods are used to verify the total AC losses in the coil, and a simple calorimetric method is presented in order to measure AC losses in a multi-harmonic context. Different analytical and numerical methods are implemented and/or used to design the superconducting coils and to compute the total AC losses in the superconducting system, and a comparison is performed to assess the advantages and drawbacks of each method.
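As a generic illustration of the electrical approach mentioned (not the authors' specific measurement chain), the AC loss can be obtained by integrating instantaneous power over one period of sampled coil voltage and current; the waveforms below are synthetic.

    import numpy as np

    f = 50.0                              # fundamental frequency, Hz
    t = np.arange(0.0, 1.0 / f, 1e-6)     # one period, 1 us sampling
    i = 10.0 * np.sin(2 * np.pi * f * t)                # coil current, A
    v = 0.2 * np.sin(2 * np.pi * f * t + np.pi / 3)     # coil voltage, V

    energy_per_cycle = np.trapz(v * i, t)   # J/cycle; reactive part averages out
    print(f"{energy_per_cycle * 1e3:.2f} mJ/cycle, {energy_per_cycle * f:.2f} W")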
Abstract:
The main purpose of this study was to examine the effect of intention on the sleep onset process from an electrophysiological point of view. To test this, two nap conditions, the Multiple Sleep Latency Test (MSLT) and the Repeated Test of Sustained Wakefulness (RTSW), were used to compare intentional and inadvertent sleep onset. Sixteen female participants (aged 19-25) spent two non-consecutive nights in the sleep lab; however, due to physical and technical difficulties only 8 participants produced complete sets of data for analysis. Each night participants were given six nap opportunities. For three of these naps they were instructed to fall asleep (MSLT); for the remaining three naps they were to attempt to remain awake (RTSW). These two types of nap opportunities represented the conditions of intentional (MSLT) and inadvertent (RTSW) sleep onset. Several other sleepiness, performance, arousal and questionnaire measures were obtained to evaluate and/or control for demand characteristics, subjective effort and mental activity during the nap tests. The nap opportunities were scored using a new 9-stage scoring system developed by Hori et al. (1994). Power spectral analyses (FFT) were also performed on the sleep onset data provided by the two nap conditions. Longer sleep onset latencies (approximately 1.25 minutes) were observed in the RTSW than in the MSLT. A higher incidence of structured mental activity was reported in the RTSW and may have been reflected in higher Beta power during the RTSW. The descent into sleep was more ragged in the RTSW, as evidenced by an increased number of shifts towards higher arousal as measured using the Hori 9-stage sleep scoring method. The sleep onset process also appears to be altered by the intention to remain awake, at least until the point of initial Stage 2 sleep (i.e. the first appearance of spindle activity). When only the final 4.3 minutes of the sleep onset process were examined (ending with spindle activity), there were significant interactions between the type of nap and the time until sleep onset for Theta, Alpha and Beta power. That is to say, the pattern of spectral power measurements in these bands differed across time as a function of the type of nap. The effect of intention, however, was quite small (η² < .04) when compared to the variance that could be accounted for by the passage of time (η² = .10 to .59). These data indicate that intention alone cannot greatly extend voluntary wakefulness if a person is sleepy. This has serious implications for people who may be required to perform dangerous tasks while sleepy, particularly for people who are in a situation that does not allow them the opportunity to engage in behavioural strategies in order to maintain their arousal.
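A minimal sketch of the kind of FFT-based band-power computation described (a generic Welch-periodogram approach with conventional band limits, not the study's exact analysis):

    import numpy as np
    from scipy.signal import welch

    BANDS = {"theta": (4.0, 8.0), "alpha": (8.0, 12.0), "beta": (12.0, 30.0)}

    def band_power(eeg, fs):
        """Integrate the power spectral density over each EEG band."""
        freqs, psd = welch(eeg, fs=fs, nperseg=4 * int(fs))
        powers = {}
        for name, (lo, hi) in BANDS.items():
            mask = (freqs >= lo) & (freqs < hi)
            powers[name] = np.trapz(psd[mask], freqs[mask])
        return powers

    fs = 256.0
    eeg = np.random.randn(int(30 * fs))   # 30 s of synthetic signal
    print(band_power(eeg, fs))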
Abstract:
Directed research project presented to the Faculté des études supérieures et postdoctorales in partial fulfilment of the requirements for the degree of Master of Science (M.Sc.) in Criminology – Internal Security option
Abstract:
This white paper reports emerging findings at the end of Phase I of the Lean Aircraft Initiative in the Policy focus group area. Specifically, it provides details about research on program instability. Its objective is to discuss high-level findings detailing: 1) the relative contribution of different factors to a program's overall instability; 2) the cost impact of program instability on acquisition programs; and 3) some strategies recommended by program managers for overcoming and/or mitigating the negative effects of program instability on their programs. Because this report comes while the research is underway, it is not meant to be a definitive document on the subject. Rather, it is anticipated that this research may produce a number of reports on program instability-related topics. The government managers of military acquisition programs rated annual budget or production rate changes, changes in requirements, and technical difficulties as the three top contributors, respectively, to program instability. When asked to partition the actual variance in their program's planned cost and schedule among these factors, it was found that the combined effects of unplanned budget and requirement changes accounted for 5.2% annual cost growth and 20% total program schedule slip. At a rate of approximately 5% annual cost growth from these factors, even conservative estimates of the cost benefits to be gained from acquisition reforms and process improvements can quickly be eclipsed by the added cost associated with program instability. Program management practices involving the integration of stakeholders from throughout the value chain into the decision-making process were rated the most effective at avoiding program instability. The use of advanced information technologies was rated the most effective at mitigating the negative impact of program instability.
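To see why roughly 5% annual cost growth quickly eclipses reform savings, compound it over a multi-year program; only the 5.2%/yr rate comes from the findings above, and the horizons below are assumed for illustration.

    # Cumulative effect of steady annual cost growth on a program baseline.
    growth = 0.052
    for years in (5, 10, 15):
        factor = (1 + growth) ** years
        print(f"{years:2d} years: {100 * (factor - 1):5.1f}% above baseline")
    # -> roughly 29%, 66% and 114% above baseline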
Abstract:
The service-oriented approach to performing distributed scientific research is potentially very powerful but is not yet widely used in many scientific fields. This is partly due to the technical difficulties involved in creating services and workflows and the inefficiency of many workflow systems with regard to handling large datasets. We present the Styx Grid Service, a simple system that wraps command-line programs and allows them to be run over the Internet exactly as if they were local programs. Styx Grid Services are very easy to create and use and can be composed into powerful workflows with simple shell scripts or more sophisticated graphical tools. An important feature of the system is that data can be streamed directly from service to service, significantly increasing the efficiency of workflows that use large data volumes. The status and progress of Styx Grid Services can be monitored asynchronously using a mechanism that places very few demands on firewalls. We show how Styx Grid Services can interoperate with Web Services and WS-Resources using suitable adapters.
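The direct service-to-service streaming highlighted above is analogous to a shell pipeline; the hypothetical sketch below uses plain subprocess pipes (not the Styx Grid Service API), with 'ls' and 'wc' standing in for wrapped scientific codes, to show data flowing between programs without an intermediate file.

    import subprocess

    # Hypothetical illustration: stream one program's output directly into
    # another, as a Styx-style workflow streams data from service to service.
    producer = subprocess.Popen(["ls", "-l"], stdout=subprocess.PIPE)
    consumer = subprocess.Popen(["wc", "-l"], stdin=producer.stdout,
                                stdout=subprocess.PIPE)
    producer.stdout.close()              # producer sees a broken pipe if wc exits
    line_count, _ = consumer.communicate()
    print(line_count.decode().strip())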
Abstract:
The slow advective-timescale dynamics of the atmosphere and oceans is referred to as balanced dynamics. An extensive body of theory for disturbances to basic flows exists for the quasi-geostrophic (QG) model of balanced dynamics, based on wave-activity invariants and nonlinear stability theorems associated with exact symmetry-based conservation laws. In attempting to extend this theory to the semi-geostrophic (SG) model of balanced dynamics, Kushner & Shepherd discovered lateral boundary contributions to the SG wave-activity invariants which are not present in the QG theory, and which affect the stability theorems. However, because of technical difficulties associated with the SG model, the analysis of Kushner & Shepherd was not fully nonlinear. This paper examines the issue of lateral boundary contributions to wave-activity invariants for balanced dynamics in the context of Salmon's nearly geostrophic model of rotating shallow-water flow. Salmon's model has certain similarities with the SG model, but also has important differences that allow the present analysis to be carried to finite amplitude. In the process, the way in which constraints produce boundary contributions to wave-activity invariants, and additional conditions in the associated stability theorems, is clarified. It is shown that Salmon's model possesses two kinds of stability theorems: an analogue of Ripa's small-amplitude stability theorem for shallow-water flow, and a finite-amplitude analogue of Kushner & Shepherd's SG stability theorem in which the ‘subsonic’ condition of Ripa's theorem is replaced by a condition that the flow be cyclonic along lateral boundaries. As with the SG theorem, this last condition has a simple physical interpretation involving the coastal Kelvin waves that exist in both models. Salmon's model has recently emerged as an important prototype for constrained Hamiltonian balanced models. The extent to which the present analysis applies to this general class of models is discussed.
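For orientation, the QG wave-activity theory that this line of work generalizes rests, at small amplitude, on the pseudomomentum invariant (a standard result quoted here for context, not this paper's contribution):

    % Small-amplitude QG pseudomomentum for disturbances q' to a basic state
    % whose potential-vorticity gradient is Q_y:
    A = \frac{\overline{q'^{2}}}{2\,Q_y},
    \qquad
    \frac{\partial A}{\partial t} + \nabla \cdot \mathbf{F} = 0,

where F is the Eliassen-Palm flux; stability theorems follow when Q_y is single-signed. The lateral boundary terms discussed in the abstract are the contributions such invariants acquire in the SG and Salmon models but not in QG.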
Abstract:
Licuri (Syagrus coronata) is a palm tree from the semiarid regions of Bahia State, Brazil. It is an important source of food and feed in that region, since its nuts are commonly eaten by humans and used as a maize substitute in poultry feed. The aim of this dissertation is to study the feasibility of natural-convection and forced-convection solar dryers, compared with traditional open-air drying, for drying licuri coconut. The study led to the construction of two prototype solar dryers for the experiments: the direct-exposure natural-convection solar drying system was built of wood, with a drying chamber covered by 4 mm laminated transparent glass and insulated using appropriate techniques. The two prototypes were comparatively analysed for performance and drying efficiency against the traditional method used by the extractive community. The variables evaluated were drying time, drying rates, and the quality of the final licuri samples. The fruits were harvested in the town of Ouricuri, in the municipality of Caldeirão Grande, BA, and a standard load of 4.0 kg was used in the experiments comparing the three drying methods. The quantitative analysis of drying rate found gains of 74% and 44% for natural and forced convection, respectively, compared with traditional drying; these rates represent a variation three to five times lower. Drying of licuri using forced convection showed better quality: the pulp remained reddish, indicating that its beta-carotene content was retained, and no change in flavour from the previous system was noticed; the final construction cost of this system, however, was higher. The prototypes proved competitive and fully resolved the technical difficulties previously encountered in the production of licuri coconut products, making it possible to add value and increase the fruit's potential use for the extractive communities of the semi-arid region of Bahia.
Abstract:
Background: Detection and quantification of hepatitis C virus (HCV) RNA is integral to diagnostic and therapeutic regimens. All molecular assays target the viral 5'-noncoding region (5'-NCR), and all show genotype-dependent variation of sensitivities and viral load results. Non-western HCV genotypes have been under-represented in evaluation studies. An alternative diagnostic target region within the HCV genome could facilitate a new generation of assays. Methods and Findings: In this study we determined by de novo sequencing that the 3'-X-tail element, characterized significantly later than the rest of the genome, is highly conserved across genotypes. To prove its clinical utility as a molecular diagnostic target, a prototype qualitative and quantitative test was developed and evaluated multicentrically on a large and complete panel of 725 clinical plasma samples, covering HCV genotypes 1-6, from four continents (Germany, UK, Brazil, South Africa, Singapore). To our knowledge, this is the most diversified and comprehensive panel of clinical and genotype specimens used in HCV nucleic acid testing (NAT) validation to date. The lower limit of detection (LOD) was 18.4 IU/ml (95% confidence interval, 15.3-24.1 IU/ml), suggesting applicability in donor blood screening. The upper LOD exceeded 10^9 IU/ml, facilitating viral load monitoring within a wide dynamic range. In 598 genotyped samples, quantified by Bayer VERSANT 3.0 branched DNA (bDNA), X-tail-based viral loads were highly concordant with bDNA for all genotypes. Correlation coefficients between bDNA and X-tail NAT, for genotypes 1-6, were 0.92, 0.85, 0.95, 0.91, 0.95, and 0.96, respectively; X-tail-based viral loads deviated by more than 0.5 log10 from 5'-NCR-based viral loads in only 12% of samples (maximum deviation, 0.85 log10). The successful introduction of X-tail NAT in a Brazilian laboratory confirmed the practical stability and robustness of the X-tail-based protocol. The assay was implemented at low reaction costs (US$8.70 per sample), short turnover times (2.5 h for up to 96 samples), and without technical difficulties. Conclusion: This study indicates a way to fundamentally improve HCV viral load monitoring and infection screening. Our prototype assay can serve as a template for a new generation of viral load assays. Additionally, to our knowledge this study provides the first open protocol to permit industry-grade HCV detection and quantification in resource-limited settings.
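A minimal sketch of the concordance check described, flagging samples whose X-tail viral load deviates from the bDNA reference by more than 0.5 log10 (the numbers below are synthetic, not the study's):

    import numpy as np

    bdna = np.array([5.2e5, 3.1e4, 8.0e6, 2.2e3])    # IU/ml, reference assay
    xtail = np.array([4.0e5, 2.9e4, 3.2e7, 2.5e3])   # IU/ml, prototype assay

    dev = np.abs(np.log10(xtail) - np.log10(bdna))
    print(dev.round(2), "| >0.5 log10:", int((dev > 0.5).sum()), "of", dev.size)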