15 results for algebraic number field

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

30.00%

Publisher:

Abstract:

Localization is information of fundamental importance for carrying out various tasks in mobile robotics. The exact degree of precision required depends on the nature of the task. GPS provides global position estimates but is restricted to outdoor environments and has an inherent imprecision of a few meters. In indoor spaces, other sensors such as lasers and cameras are commonly used for position estimation, but these require landmarks (or maps) in the environment and a fair amount of computation to process complex algorithms; they also have a limited field of vision. Wireless Networks (WN) are now widely available in indoor environments and can enable efficient global localization with relatively low computing resources. However, the inherent instability of the wireless signal prevents its use for very accurate position estimation. The growing number of Access Points (AP) increases the areas of overlapping signals, which could be a useful means of improving localization precision. In this paper we evaluate the impact of the number of Access Points on mobile node localization using Artificial Neural Networks (ANN). We use three to eight APs as signal sources and show how the ANNs learn and generalize the data. In addition, we assess the robustness of the ANNs and evaluate a heuristic that attempts to decrease the localization error. To validate our approach, several ANN topologies were evaluated in experimental tests conducted with a mobile node in an indoor space.
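The idea of learning a position estimate from access-point signal strengths can be illustrated with a toy regression. The sketch below (hypothetical path-loss model, room geometry, and network topology; not the authors' experimental setup) trains a small multi-layer perceptron to map noisy RSSI vectors from six APs to a 2-D indoor position:

```python
# Toy sketch of WiFi-fingerprint localization with an ANN (hypothetical
# path-loss model and geometry; not the paper's experimental setup).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

n_aps = 6                                       # the paper varies 3 to 8 APs
ap_xy = rng.uniform(0, 20, size=(n_aps, 2))     # assumed AP positions (m)

def rssi(pos, noise_db=2.0):
    """Noisy log-distance path-loss model (illustrative constants, in dB)."""
    d = np.linalg.norm(ap_xy - pos, axis=1) + 1e-3
    return -40.0 - 20.0 * np.log10(d) + rng.normal(0.0, noise_db, n_aps)

train_xy = rng.uniform(0, 20, size=(500, 2))    # synthetic training positions
X_train = np.array([rssi(p) for p in train_xy])
mu, sd = X_train.mean(axis=0), X_train.std(axis=0)

net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000, random_state=0)
net.fit((X_train - mu) / sd, train_xy)          # standardize RSSI inputs

test_xy = rng.uniform(0, 20, size=(100, 2))
X_test = np.array([rssi(p) for p in test_xy])
pred = net.predict((X_test - mu) / sd)
err = np.linalg.norm(pred - test_xy, axis=1)
print(f"mean localization error: {err.mean():.2f} m")
```

The paper's heuristic and topology sweep are not reproduced here; the sketch only shows the supervised mapping from signal space to position space.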

Relevance:

30.00%

Publisher:

Abstract:

This study compared the results of reverse transcription-polymerase chain reaction (RT-PCR) and traditional virus isolation in cell culture for the detection of viral haemorrhagic septicaemia virus (VHSV) and infectious haematopoietic necrosis virus (IHNV). RT-PCR was applied to 172 tissue sample pools (859 fish in total) originating from a field survey on the occurrence of VHSV and IHNV in farmed and wild salmonids in Switzerland. These samples represented all sites with fish that were identified as virus-positive by means of virus isolation (three sites, four positive tissue sample pools) and/or demonstrated positive anti-VHSV antibody titres (83 sites, 121 positive blood samples) in a serum plaque neutralization test (SPNT). RT-PCR confirmed the four VHSV-positive tissue sample pools detected by virus isolation and additionally identified one VHSV-positive sample that showed positive anti-VHSV antibody titres but was negative in virus isolation. For IHNV, RT-PCR detected two positive samples not identified by virus isolation; in these fish the SPNT result had been questionable. One of the IHNV-positive samples represents the first detection of IHNV RNA in wild brown trout in Switzerland. Compared with SPNT, RT-PCR, like virus isolation, detected a much lower number of positive cases; reasons for this discrepancy are discussed. Our results indicate that RT-PCR can not only be successfully applied in field surveys, but may also be slightly more sensitive than virus isolation. However, in a titration experiment under laboratory conditions, the sensitivity of RT-PCR was not significantly higher than that of virus isolation.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Conventionally, endosseous dental implants have required 3 to 6 months of uninterrupted healing, based on observations of implants characterized by a relatively smooth machined surface. Many studies have since demonstrated that implants with a roughened surface result in greater bone apposition, earlier bone contact, and a stronger bond between implant and bone, suggesting that such implants could be loaded earlier than 3 to 6 months. Formal clinical studies confirmed that implants with rough surfaces can have abutments placed and be loaded occlusally as early as 6 weeks after placement. The purpose of this prospective human clinical investigation was to evaluate a large number of implants with a specific rough surface (sand-blasted, acid-etched [SLA]) placed under routine private-practice conditions. METHODS: A prospective, multicenter, observational human clinical study was initiated with the goal of recruiting a minimum of 500 patients and 800 implants. The implants were to be placed and restored in predominantly private-practice settings around the world. Ninety-two practitioners in 16 countries agreed to participate, and 86 followed the study design. Patients had to be in good health, have sufficient bone to encase the implant, and agree to return for recall appointments. Exclusion criteria included heavy smoking (>10 cigarettes a day) and bone augmentation procedures at the implant site. All implants were two-piece (an abutment was to be placed after 6 weeks of healing) and featured a transmucosal polished collar; each had an SLA surface. All implants were positioned using a non-submerged (single-stage) surgical technique. Survival and success rates were calculated by life-table analyses. RESULTS: A total of 706 patients were enrolled and 1,406 implants were placed. In the final analyses, 590 patients with 990 implants (70.4% of those enrolled) met all inclusion criteria, including placement of an abutment and provisional restoration within 63 days of surgical placement. The majority of implants were 10 or 12 mm long (78.7%) and were placed in type II or III bone (87%). Seventy-three percent of the implants were placed in the mandible and 27% in the maxilla. The cumulative survival rate was 99.56% at 3 years and 99.26% at 5 years. The overall success rate was 99.12% at 3 years and 97.38% after 5 years. CONCLUSIONS: Under private-practice conditions, implants with an SLA surface could be placed and restored predictably within 6 to 8 weeks. Data from this prospective, multicenter, observational study reinforced the results of more formal clinical studies and demonstrated that implants with the SLA surface can be restored in approximately half the time of conventional healing periods.
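The cumulative rates quoted above come from life-table (actuarial) analyses. As a worked illustration with hypothetical interval data (the study's yearly failure and withdrawal counts are not given in the abstract), the actuarial estimate multiplies per-interval survival proportions, treating withdrawals as at risk for half the interval:

```python
# Actuarial life-table estimate of cumulative survival,
#   S_k = prod_i (1 - d_i / n'_i),  with n'_i = n_i - w_i / 2,
# where d_i are failures and w_i withdrawals in interval i.
# The yearly counts below are hypothetical, not the study's data.
def life_table_survival(intervals):
    """intervals: list of (entering n, failures d, withdrawals w) per year."""
    s = 1.0
    rates = []
    for n, d, w in intervals:
        at_risk = n - w / 2.0          # withdrawals assumed at mid-interval
        s *= 1.0 - d / at_risk
        rates.append(s)
    return rates

# Hypothetical yearly data for a cohort of 990 implants.
data = [(990, 3, 80), (907, 1, 120), (786, 0, 150), (636, 1, 200), (435, 1, 180)]
for year, s in enumerate(life_table_survival(data), start=1):
    print(f"year {year}: cumulative survival {100 * s:.2f}%")
```
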

Relevance:

30.00%

Publisher:

Abstract:

High-density spatial and temporal sampling of EEG data enhances the quality of results of electrophysiological experiments. Because EEG sources typically produce widespread electric fields (see Chapter 3) and operate at frequencies well below the sampling rate, increasing the number of electrodes and time samples will not necessarily increase the number of observed processes, but mainly increases the accuracy with which these processes are represented. This is notably the case when inverse solutions are computed. As a consequence, increasing the sampling in space and time increases the redundancy of the data (in space because electrodes are correlated due to volume conduction, and in time because neighboring time points are correlated), while the degrees of freedom of the data change only little. This has to be taken into account when statistical inferences are to be made from the data. However, many ERP studies have disregarded the intrinsic correlation structure of the data. Often, some electrodes or groups of electrodes are selected a priori as the analysis entity and treated as repeated (within-subject) measures that are analyzed using standard univariate statistics. The increased spatial resolution obtained with more electrodes is thus poorly represented by the resulting statistics. In addition, the assumptions made (e.g., as to what constitutes a repeated measure) are not supported by what we know about the properties of EEG data. From the point of view of physics (see Chapter 3), the natural “atomic” analysis entity of EEG and ERP data is the scalp electric field.

Relevance:

30.00%

Publisher:

Abstract:

Background: The literature on the application of homeopathy to the control of plant diseases, in both phytopathological models and field trials, was first reviewed by Scofield in 1984. No other review of homeopathy in plant pathology has been published since, although much new research has subsequently been carried out using more advanced methods. Objectives: To conduct an up-to-date review of the existing literature on basic research in homeopathy using phytopathological models and field experiments. Methods: A literature search was carried out on publications from 1969 to 2009 for papers reporting experiments on homeopathy that used phytopathological models (in vitro and in planta) or field trials. The selected papers were summarized and analysed on the basis of a Manuscript Information Score (MIS) to identify those providing sufficient information for proper interpretation (MIS ≥ 5); these were then evaluated using a Study Methods Evaluation Procedure (SMEP). Results: A total of 44 publications on phytopathological models were identified: 19 papers included statistics, and 6 studies reached MIS ≥ 5. Nine publications on field trials were identified, 6 with MIS ≥ 5. In general, significant and reproducible effects of decimal and centesimal potencies were found, including dilution levels beyond Avogadro's number. Conclusions: The prospects for homeopathic treatments in agriculture are promising, but much more experimentation is needed, especially at field level and on potentisation techniques, effective potency levels, and conditions for reproducibility. Phytopathological models may also develop into useful tools to answer pharmaceutical questions.
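The MIS-based screening step described in the Methods can be sketched as a simple filter (hypothetical records; the field names and scores are illustrative, not taken from the review):

```python
# Toy sketch of the MIS >= 5 screening step (hypothetical records).
papers = [
    {"title": "A", "model": "in vitro",  "mis": 7, "has_stats": True},
    {"title": "B", "model": "in planta", "mis": 4, "has_stats": True},
    {"title": "C", "model": "field",     "mis": 6, "has_stats": False},
]

# Only papers with enough reported information proceed to SMEP evaluation.
adequate = [p for p in papers if p["mis"] >= 5]
print([p["title"] for p in adequate])
```
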

Relevance:

30.00%

Publisher:

Abstract:

We present a novel approach to the inference of spectral functions from Euclidean-time correlator data that makes close contact with modern Bayesian concepts. Our method differs significantly from the maximum entropy method (MEM). A new set of axioms is postulated for the prior probability, leading to an improved expression that is devoid of the asymptotically flat directions present in the Shannon-Jaynes entropy. Hyperparameters are integrated out explicitly, liberating us from the Gaussian approximations underlying the evidence approach of the MEM. We present a realistic test of our method in the context of the nonperturbative extraction of the heavy-quark potential. Based on hard-thermal-loop correlator mock data, we establish firm requirements on the number of data points and their accuracy for a successful extraction of the potential from lattice QCD. Finally, we reinvestigate quenched lattice QCD correlators from a previous study and provide an improved potential estimation at T ≈ 2.33 T_C.

Relevance:

30.00%

Publisher:

Abstract:

High-brightness electron sources are of great importance for the operation of hard X-ray free-electron lasers. Field-emission cathodes based on double-gate metallic field-emitter arrays (FEAs) can potentially offer higher brightness than the sources currently in use. We report on the successful application of electron-beam lithography to the fabrication of large-scale single-gate as well as double-gate FEAs. We demonstrate operational high-density single-gate FEAs with sub-micron pitch and up to 10^6 tips, as well as large-scale double-gate FEAs with large collimation gate apertures. The details of the design, the fabrication procedure, and successful measurements of the emission current from the single- and double-gate cathodes are presented.

Relevance:

30.00%

Publisher:

Abstract:

Coarse semantic encoding and broad categorization behavior are the hallmarks of the right cerebral hemisphere's contribution to language processing. We correlated 40 healthy subjects' breadth of categorization as assessed with Pettigrew's category width scale with lateral asymmetries in perceptual and representational space. Specifically, we hypothesized broader category width to be associated with larger leftward spatial biases. For the 20 men, but not the 20 women, this hypothesis was confirmed both in a lateralized tachistoscopic task with chimeric faces and a random digit generation task; the higher a male participant's score on category width, the more pronounced were his left-visual field bias in the judgement of chimeric faces and his small-number preference in digit generation ("small" is to the left of "large" in number space). Subjects' category width was unrelated to lateral displacements in a blindfolded tactile-motor rod centering task. These findings indicate that visual-spatial functions of the right hemisphere should not be considered independent of the same hemisphere's contribution to language. Linguistic and spatial cognition may be more tightly interwoven than is currently assumed.

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: Hodgkin lymphoma (HL) is a highly curable disease, so reducing late complications and second malignancies has become increasingly important. Radiotherapy target paradigms are currently changing, and radiotherapy techniques are evolving rapidly. DESIGN: This overview reports the extent to which target volume reduction in involved-node radiotherapy (INRT) and advanced techniques such as intensity-modulated radiotherapy (IMRT) and proton therapy, compared with involved-field radiotherapy (IFRT) and 3D radiotherapy (3D-RT), can reduce high doses to organs at risk (OAR), and examines the issues that remain open. RESULTS: Although no comparison of all available techniques on identical patient datasets exists, clear patterns emerge. Advanced dose-calculation algorithms (e.g., convolution-superposition/Monte Carlo) should be used in mediastinal HL. INRT consistently reduces treated volumes compared with IFRT, with the exact amount depending on the INRT definition. With INRT, fewer patients are likely to benefit significantly from highly conformal techniques such as IMRT over 3D-RT with respect to high-dose exposure of OAR. The impact of the larger volumes treated with low doses in advanced techniques is unclear. The type of IMRT used (static/rotational) is of minor importance; all advanced photon techniques offer similar potential benefits and disadvantages, so the degree of modulation should be chosen based on individual treatment goals. Treatment in deep-inspiration breath hold is being evaluated. Protons theoretically provide both excellent high-dose conformality and reduced integral dose. CONCLUSION: Further reduction of treated volumes is the most effective way to reduce OAR dose, most likely without disadvantages if the excellent control rates achieved currently are maintained. For both IFRT and INRT, the benefits of advanced radiotherapy techniques depend on the individual patient/target geometry; their use should therefore be decided case by case with comparative treatment planning.

Relevance:

30.00%

Publisher:

Abstract:

Increasing evidence indicates that the tumor microenvironment (TME) is crucial for tumor survival and metastasis. Inflammatory cells accumulate around tumors and, strangely, appear to be permissive to their growth. One key stromal cell is the mast cell (MC), which can secrete numerous pro- and antitumor molecules. We investigated the presence and degranulation state of MC in pancreatic ductal adenocarcinoma (PDAC) as compared to acute pancreatitis (AP). Three different detection methods, (a) toluidine blue staining as well as immunohistochemistry for (b) tryptase and (c) c-kit, were used to assess the number and extent of degranulation of MC in PDAC tissue (n=7), uninvolved pancreatic tissue derived from tumor-free margins (n=7), and tissue from AP (n=4). The number of MC detected with all three methods was significantly increased in PDAC compared with normal pancreatic tissue derived from tumor-free margins (p<0.05). The highest number of MC was identified by c-kit: 22.2±7.5 per high-power field (HPF) in PDAC vs 9.7±5.1 per HPF in normal tissue. In contrast to AP, where most of the detected MC were degranulated, MC in PDAC appeared intact. In conclusion, MC are increased in number but not degranulated in PDAC, suggesting that they may contribute to cancer growth by permitting the selective release of pro-tumorigenic molecules.

Relevance:

30.00%

Publisher:

Abstract:

This study aims at assessing the skill of several climate field reconstruction (CFR) techniques in reconstructing past precipitation over continental Europe and the Mediterranean at seasonal time scales over the last two millennia from proxy records. A number of pseudoproxy experiments are performed within the virtual reality of a regional paleoclimate simulation at 45 km resolution to analyse different aspects of reconstruction skill. Canonical Correlation Analysis (CCA), two versions of an Analog Method (AM), and Bayesian hierarchical modeling (BHM) are applied to reconstruct precipitation from a synthetic network of pseudoproxies contaminated with various types of noise. The skill of the derived reconstructions is assessed through comparison with the precipitation simulated by the regional climate model. Unlike BHM, CCA systematically underestimates the variance. The AM can be adjusted to overcome this shortcoming, presenting an intermediate behaviour between the two aforementioned techniques. However, a trade-off between reconstruction-target correlation and reconstructed variance is a drawback of all CFR techniques: CCA (BHM) presents the largest (lowest) skill in preserving the temporal evolution, whereas the AM can be tuned to reproduce better correlation at the expense of losing variance. While BHM has been shown to perform well for temperature, it relies heavily on prescribed spatial correlation lengths; this assumption, valid for temperature, is hardly warranted for precipitation. In general, none of the methods outperforms the others. All experiments agree that a dense and regularly distributed proxy network is required to reconstruct precipitation accurately, reflecting its high spatial and temporal variability. This is especially true in summer, when localised convective precipitation events cause especially short de-correlation distances from the proxy locations.

Relevance:

30.00%

Publisher:

Abstract:

We investigate numerically the effects of nozzle-exit flow conditions on jet-flow development and the near-field sound at a diameter-based Reynolds number Re_D = 18,100 and Mach number Ma = 0.9. Our computational setup features the inclusion of a cylindrical nozzle, which allows us to establish a physical nozzle-exit flow and therefore well-defined initial jet-flow conditions. Within the nozzle, the flow is modeled by a potential flow core and a laminar, transitional, or developing turbulent boundary layer. The goal is to document and compare the effects of the different jet inflows on jet-flow development and sound radiation. For laminar and transitional boundary layers, transition to turbulence in the jet shear layer is governed by the development of Kelvin-Helmholtz instabilities. With the turbulent nozzle boundary layer, the jet-flow development is characterized by a rapid changeover to a turbulent free shear layer within about one nozzle diameter. Sound pressure levels are strongly enhanced for laminar and transitional exit conditions compared with the turbulent case. However, a frequency and frequency-wavenumber analysis of the near-field pressure indicates that the dominant sound radiation characteristics remain largely unaffected. By applying a recently developed scaling procedure, we obtain a close match of the scaled near-field sound spectra for all nozzle-exit turbulence levels, and also reasonable agreement with experimental far-field data.
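A frequency-wavenumber analysis of the kind mentioned above can be sketched with a 2-D FFT of a pressure trace p(t, x) sampled along a line; components then separate by their phase speed f/k. The signal below is synthetic (illustrative wave components, not the simulation data):

```python
# Toy frequency-wavenumber decomposition of a synthetic pressure trace
# p(t, x) via a 2-D FFT (illustrative signal, not the simulation data).
import numpy as np

nt, nx = 256, 128
t = np.arange(nt) / nt           # one unit time period
x = np.arange(nx) / nx           # one unit spatial period
T, X = np.meshgrid(t, x, indexing="ij")

# Two travelling waves with different phase speeds (f/k = 2.5 and 10/7).
p = np.sin(2 * np.pi * (5 * T - 2 * X)) + 0.3 * np.sin(2 * np.pi * (10 * T - 7 * X))

P = np.abs(np.fft.fft2(p))               # |spectrum| over (frequency, wavenumber)
freqs = np.fft.fftfreq(nt, d=1 / nt)     # integer frequencies
waven = np.fft.fftfreq(nx, d=1 / nx)     # integer wavenumbers

# The dominant spectral peak recovers the stronger wave's (f, k) pair.
i, j = np.unravel_index(np.argmax(P), P.shape)
print(f"dominant component: |f| = {abs(freqs[i])}, |k| = {abs(waven[j])}")
```
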

Relevance:

30.00%

Publisher:

Abstract:

In recent decades, affine algebraic varieties and Stein manifolds with big (infinite-dimensional) automorphism groups have been intensively studied. Several notions expressing that the automorphism group is big have been proposed, all of which imply that the manifold in question is an Oka–Forstnerič manifold. This important notion has also recently emerged from the intensive studies around the homotopy principle in Complex Analysis. This homotopy principle, which goes back to the 1930s, has had an enormous impact on the development of the area of Several Complex Variables, and the number of its applications is constantly growing. In this overview chapter we present three classes of properties: (1) density property, (2) flexibility, and (3) Oka–Forstnerič. For each class we give the relevant definitions and its most significant features, and we explain the known implications between all these properties. Many difficult mathematical problems could be solved by applying the developed theory; we indicate some of the most spectacular ones.

Relevance:

30.00%

Publisher:

Abstract:

We tested the assumption that ego depletion would affect the sprint start in a sample of N = 38 athletes without track-and-field experience, using a mixed between-subjects (depletion vs. non-depletion) and within-subjects (T1: before vs. T2: after the ego-depletion manipulation) design. We assumed that ego depletion would increase the likelihood of a false start, as regulating the impulse to initiate the sprinting movement too soon before the starting signal requires self-control. In line with this assumption, we found a significant interaction: the number of false starts increased significantly from T1 to T2 in the depletion group but not in the non-depletion group. We conclude that ego depletion has a detrimental influence on the sprint start in athletes without track-and-field experience.

Relevance:

30.00%

Publisher:

Abstract:

In laboratory experiments, people are willing to sanction norm violations at a cost—a behavioral tendency called altruistic punishment. However, the degree to which these findings generalize to real-world interactions is still debated. Only a small number of field experiments have been conducted, and initial results suggest that punishment is less frequent outside the lab. This study replicates one of the first field experiments on altruistic punishment and builds ties to research on norm compliance and the broken windows theory. The original study addressed the enforcement of the anti-littering norm in Athens; we replicate it in Bern, Zurich, and New York City. As a first extension, we investigate how the experimental context (clean vs. littered) affects social norm enforcement; as a second, we investigate how opportunity structure affects the maintenance of the anti-littering norm. Findings indicate that norms are universally enforced, although significantly less than in the standard laboratory experiment, and that enforcement is significantly more common in Switzerland than in New York. Moreover, individuals prefer more subtle forms of enforcement to direct punishment. We also find that enforcement is less frequent in littered than in clean contexts, suggesting that broken windows might not only foster deviant behavior but also weaken informal social control. Finally, we find that opportunity structure can encourage people to maintain norms: people are more likely to voluntarily pick up litter when it is close to a trash bin.