57 results for cloud-based computing
Abstract:
In this paper we present a new population-based implant design methodology, which advances state-of-the-art approaches by combining shape and bone quality information into the design strategy. The method enhances the mechanical stability of the fixation and reduces the intra-operative in-plane bending, which might impede the functionality of the locking mechanism. The method is presented for the case of mandibular locking fixation plates, where the mandibular angle and the bone quality at screw locations are taken into account. Using computational anatomy techniques, the method automatically derives, from a set of computed tomography images, the mandibular angle and the bone thickness and intensity values along the path of every screw. An optimisation strategy is then used to optimise the two parameters of plate angle and screw position. Results for the new design are presented along with a comparison with a commercially available mandibular locking fixation plate. A statistically highly significant improvement was observed. Our experiments allowed us to conclude that an angle of 126° and a screw separation of 8 mm constitute a more suitable design than the standard 120° and 9 mm.
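As an illustration only and not the authors' actual pipeline, a brute-force search over the two design parameters could be sketched as follows; stability_score is a hypothetical objective assumed to aggregate bone thickness and intensity along every screw path across the training population.

import numpy as np

def optimise_plate(angles_deg, separations_mm, stability_score):
    """Exhaustive search over the two design parameters named in the abstract.

    stability_score(angle, separation) is a placeholder for an objective that
    aggregates bone thickness/intensity sampled along each screw path of the
    candidate design over the whole training population."""
    return max(((a, s) for a in angles_deg for s in separations_mm),
               key=lambda p: stability_score(*p))

# Dummy objective for demonstration only; its optimum is placed at the design
# reported in the abstract so the call returns something sensible.
best = optimise_plate(np.arange(115, 131), np.arange(6, 13),
                      lambda a, s: -((a - 126) ** 2 + (s - 8) ** 2))
print(best)  # (126, 8) for this dummy objective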
Abstract:
Indoor location-awareness will be an inseparable feature of mobile services and applications in future wireless networks. Its ubiquitous availability is, however, still obstructed by technological challenges and privacy issues. We propose an innovative approach to indoor positioning whose main goal is to develop a system that is self-learning and able to adapt to various radio propagation environments. The approach combines estimation of the propagation conditions, subsequent appropriate channel modelling, and optimisation feedback to the positioning algorithm in use. The main advantages of the proposal are decreased system set-up effort, automatic re-calibration and increased precision.
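As a minimal sketch of the self-calibration idea only, assuming a simple log-distance path-loss channel model (the model choice, function names and parameters are ours, not the authors'): re-fitting the model on freshly collected measurements amounts to automatic re-calibration of the ranging step.

import numpy as np

def fit_path_loss(distances_m, rssi_dbm, d0=1.0):
    """Least-squares fit of the log-distance model RSSI(d) = A - 10*n*log10(d/d0).

    Returns the reference power A at distance d0 and the path-loss exponent n."""
    x = -10.0 * np.log10(np.asarray(distances_m, dtype=float) / d0)
    design = np.column_stack([np.ones_like(x), x])
    (A, n), *_ = np.linalg.lstsq(design, np.asarray(rssi_dbm, dtype=float), rcond=None)
    return A, n

def estimate_distance(rssi_dbm, A, n, d0=1.0):
    """Invert the fitted model to turn a new RSSI reading into a range estimate."""
    return d0 * 10.0 ** ((A - rssi_dbm) / (10.0 * n))

# Toy calibration data (metres, dBm), then ranging with the fitted model.
A, n = fit_path_loss([1, 2, 5, 10, 20], [-40, -46, -54, -60, -66])
print(estimate_distance(-57.0, A, n))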
Abstract:
A new physics-based technique for correcting inhomogeneities present in sub-daily temperature records is proposed. The approach accounts for changes in the sensor-shield characteristics that affect the energy balance in a way that depends on ambient weather conditions (radiation, wind). An empirical model is formulated that reflects the main atmospheric processes and can be used in the correction step of a homogenization procedure. The model accounts for the short- and long-wave radiation fluxes (including a snow cover component for the albedo calculation) of a measurement system, such as a radiation shield. One part of the flux is further modulated by ventilation. The model requires only cloud cover and wind speed for each day, but detailed site-specific information is necessary. The final model has three free parameters, one of which is a constant offset. The three parameters can be determined, e.g., using the mean offsets for three observation times. The model is developed using the example of the change from the Wild screen to the Stevenson screen in the temperature record of Basel, Switzerland, in 1966. It is evaluated based on parallel measurements of both systems during a sub-period at this location, which were discovered during the writing of this paper. The model can be used in the correction step of homogenization to distribute a known mean step size to every single measurement, thus providing a reasonable alternative correction procedure for high-resolution historical climate series. It also constitutes an error model, which may be applied, e.g., in data assimilation approaches.
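The abstract does not give the model equations, so the sketch below only illustrates the general shape such a correction model could take: a radiative term damped by ventilation plus a constant offset, driven by daily cloud cover and wind speed. All terms, coefficients and names are assumptions, not the authors' formulation.

def correction(cloud_cover, wind_speed_ms, snow_cover, c_sw, c_lw, c0):
    """Illustrative three-parameter correction (K) for one sub-daily reading.

    cloud_cover in [0, 1], wind_speed_ms in m/s, snow_cover as a 0/1 flag used
    for a crude albedo switch; c_sw, c_lw are flux coefficients and c0 is the
    constant offset mentioned in the abstract."""
    albedo = 0.7 if snow_cover else 0.2                          # snow raises reflected short-wave
    sw_proxy = (1.0 - 0.75 * cloud_cover ** 3) * (1.0 + albedo)  # toy short-wave loading
    lw_proxy = -(1.0 - cloud_cover)                              # toy long-wave cooling
    ventilation = 1.0 / (1.0 + wind_speed_ms)                    # stronger wind, smaller error
    return c0 + ventilation * (c_sw * sw_proxy + c_lw * lw_proxy)

print(correction(cloud_cover=0.3, wind_speed_ms=2.5, snow_cover=1,
                 c_sw=0.8, c_lw=0.5, c0=-0.2))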
Abstract:
This paper proposes a sequential coupling of a Hidden Markov Model (HMM) recognizer for offline handwritten English sentences with a probabilistic bottom-up chart parser using Stochastic Context-Free Grammars (SCFG) extracted from a text corpus. Based on extensive experiments, we conclude that syntax analysis helps to improve recognition rates significantly.
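As a hedged sketch of how such a sequential coupling can be wired up, using NLTK's probabilistic parser as a stand-in for the paper's bottom-up chart parser; the toy grammar, the n-best interface and the 0.5 combination weight are illustrative assumptions.

import math
from nltk import PCFG
from nltk.parse import ViterbiParser

# Toy SCFG; in the paper the grammar is extracted from a text corpus.
grammar = PCFG.fromstring("""
    S -> NP VP [1.0]
    NP -> 'he' [0.5] | 'we' [0.5]
    VP -> V NP [0.4] | V [0.6]
    V -> 'writes' [1.0]
""")
parser = ViterbiParser(grammar)

def rescore(nbest):
    """Pick the n-best HMM hypothesis with the best combined score.

    nbest: list of (token_list, hmm_log_score); hypotheses that the grammar
    cannot parse are discarded, which is where syntax helps recognition."""
    best, best_score = None, float('-inf')
    for tokens, hmm_score in nbest:
        try:
            trees = list(parser.parse(tokens))
        except ValueError:          # a word is not covered by the toy grammar
            trees = []
        if not trees:
            continue
        score = hmm_score + 0.5 * math.log(trees[0].prob())
        if score > best_score:
            best, best_score = tokens, score
    return best

# The HMM prefers the ungrammatical hypothesis; the parser overrules it.
print(rescore([(['writes', 'he'], -3.9), (['he', 'writes'], -4.2)]))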
Abstract:
BACKGROUND: In industrialized countries vaccination coverage remains suboptimal, partly because of perception of an increased risk of asthma. Epidemiologic studies of the association between childhood vaccinations and asthma have provided conflicting results, possibly for methodologic reasons such as unreliable vaccination data, biased reporting, and reverse causation. A recent review stressed the need for additional, adequately controlled large-scale studies. OBJECTIVE: Our goal was to determine if routine childhood vaccination against pertussis was associated with subsequent development of childhood wheezing disorders and asthma in a large population-based cohort study. METHODS: In 6811 children from the general population born between 1993 and 1997 in Leicestershire, United Kingdom, respiratory symptom data from repeated questionnaire surveys up to 2003 were linked to independently collected vaccination data from the National Health Service database. We compared incident wheeze and asthma between children of different vaccination status (complete, partial, and no vaccination against pertussis) by computing hazard ratios. Analyses were based on 6048 children, 23 201 person-years of follow-up, and 2426 cases of new-onset wheeze. RESULTS: There was no evidence for an increased risk of wheeze or asthma in children vaccinated against pertussis compared with nonvaccinated children. Adjusted hazard ratios comparing fully and partially vaccinated with nonvaccinated children were close to one for both incident wheeze and asthma. CONCLUSION: This study provides no evidence of an association between vaccination against pertussis in infancy and an increased risk of later wheeze or asthma and does not support claims that vaccination against pertussis might significantly increase the risk of childhood asthma.
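For illustration only, a hazard-ratio comparison of this kind might be set up with a Cox proportional-hazards model as sketched below; the data frame, column names and toy numbers are hypothetical, and the study's actual model specification (covariate adjustment, time scale) may differ.

import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical analysis frame: one row per child, follow-up time in years,
# an event flag for new-onset wheeze, and dummy-coded vaccination status
# with the non-vaccinated group as the reference category.
df = pd.DataFrame({
    "followup_years": [2.1, 6.2, 4.8, 7.1, 5.5, 3.9, 6.7, 5.0],
    "wheeze":         [1,   1,   0,   1,   0,   1,   0,   1  ],
    "fully_vacc":     [0,   1,   0,   1,   1,   0,   0,   0  ],
    "partially_vacc": [0,   0,   1,   0,   0,   1,   0,   1  ],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="wheeze")
cph.print_summary()   # exp(coef) gives the hazard ratios versus non-vaccinated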
Abstract:
Background Access to health care can be described along four dimensions: geographic accessibility, availability, financial accessibility and acceptability. Geographic accessibility measures how physically accessible resources are for the population, while availability reflects what resources are available and in what amount. Combining these two types of measure into a single index provides a measure of geographic (or spatial) coverage, which is an important measure for assessing the degree of accessibility of a health care network. Results This paper describes the latest version of AccessMod, an extension to the Geographical Information System ArcView 3.x, and provides an example of application of this tool. AccessMod 3 allows one to compute the geographic coverage of health care using terrain information and population distribution. Four major types of analysis are available in AccessMod: (1) modeling the coverage of catchment areas linked to an existing health facility network based on travel time, to provide a measure of physical accessibility to health care; (2) modeling geographic coverage according to the availability of services; (3) projecting the coverage of a scaling-up of an existing network; (4) providing information for cost-effectiveness analysis when little information about the existing network is available. In addition to integrating travelling time, population distribution and the population coverage capacity specific to each health facility in the network, AccessMod can incorporate the influence of landscape components (e.g. topography, river and road networks, vegetation) that impact travelling time to and from facilities. Topographical constraints can be taken into account through an anisotropic analysis that considers the direction of movement. We provide an example of the application of AccessMod in the southern part of Malawi that shows the influences of the landscape constraints and of the modes of transportation on geographic coverage. Conclusion By incorporating the demand (population) and the supply (capacities of health care centers), AccessMod provides a unifying tool to efficiently assess the geographic coverage of a network of health care facilities. This tool should be of particular interest to developing countries that have relatively good geographic information on population distribution, terrain, and health facility locations.
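Not AccessMod itself, but a minimal sketch of the kind of slope-aware travel-time accumulation such a tool performs on a raster, using Tobler's hiking function as one common choice of direction-dependent walking speed; the grid layout and parameters are assumptions.

import heapq
import numpy as np

def travel_time_hours(dem, cell_m, facility_rc):
    """Accumulated walking time (hours) from a facility cell over a DEM grid.

    The speed of each move depends on the signed slope in the direction of
    travel (Tobler's hiking function), which makes the analysis anisotropic."""
    rows, cols = dem.shape
    time = np.full(dem.shape, np.inf)
    time[facility_rc] = 0.0
    heap = [(0.0, facility_rc)]
    while heap:
        t, (r, c) = heapq.heappop(heap)
        if t > time[r, c]:
            continue
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                slope = (dem[nr, nc] - dem[r, c]) / cell_m
                speed_kmh = 6.0 * np.exp(-3.5 * abs(slope + 0.05))
                nt = t + (cell_m / 1000.0) / speed_kmh
                if nt < time[nr, nc]:
                    time[nr, nc] = nt
                    heapq.heappush(heap, (nt, (nr, nc)))
    return time

Thresholding the returned grid at a maximum travel time and summing a population raster over the resulting catchment then gives a simple geographic coverage figure.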
Abstract:
Seasonal snow cover is of great environmental and socio-economic importance for the European Alps. Therefore, a high priority has been assigned to quantifying its temporal and spatial variability. Complementary to land-based monitoring networks, optical satellite observations can be used to derive spatially comprehensive information on snow cover extent. For understanding long-term changes in alpine snow cover extent, the data acquired by the Advanced Very High Resolution Radiometer (AVHRR) sensors mounted onboard the National Oceanic and Atmospheric Administration (NOAA) and Meteorological Operational satellite (MetOp) platforms offer a unique source of information. In this paper, we present the first space-borne 1 km snow extent climatology for the Alpine region derived from AVHRR data over the period 1985–2011. The objective of this study is twofold: first, to generate a new set of cloud-free satellite snow products using a specific cloud gap-filling technique, and second, to examine the spatiotemporal distribution of snow cover in the European Alps over the last 27 yr from the satellite perspective. For this purpose, snow parameters such as snow onset day, snow cover duration (SCD), melt-out date and the snow cover area percentage (SCA) were employed to analyze the spatiotemporal variability of snow cover over the course of three decades. On the regional scale, significant trends were found toward a shorter SCD at lower elevations in the south-east and south-west. However, our results do not show any significant trends in the monthly mean SCA over the last 27 yr. This is in agreement with other research findings and may indicate a deceleration of the decreasing snow trend in the Alpine region. Furthermore, such data may provide spatially and temporally homogeneous snow information for comprehensive use in related research fields (e.g., hydrologic and economic applications) or can serve as a reference for climate models.
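As a sketch of how such parameters can be derived from gap-filled daily binary snow maps (the paper's actual trend test may differ, e.g. a Mann-Kendall test), the per-pixel snow cover duration and its linear trend could be computed as follows; array shapes and names are assumptions.

import numpy as np
from scipy import stats

def scd_and_trend(snow_daily):
    """Per-pixel snow cover duration (SCD) and its linear trend.

    snow_daily: array of shape (years, days, rows, cols) with 1 = snow,
    0 = snow-free; cloudy pixels are assumed to be gap-filled beforehand."""
    scd = snow_daily.sum(axis=1).astype(float)     # days of snow per year, per pixel
    years = np.arange(scd.shape[0])
    slope = np.empty(scd.shape[1:])
    pval = np.empty(scd.shape[1:])
    for r in range(scd.shape[1]):
        for c in range(scd.shape[2]):
            fit = stats.linregress(years, scd[:, r, c])
            slope[r, c], pval[r, c] = fit.slope, fit.pvalue
    return scd, slope, pval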
Abstract:
Given a reproducing kernel Hilbert space (H, ⟨·,·⟩) of real-valued functions and a suitable measure μ over the source space D ⊂ R, we decompose H as the sum of a subspace of functions centered with respect to μ and its orthogonal complement in H. This decomposition leads to a special case of ANOVA kernels, for which the functional ANOVA representation of the best predictor can be elegantly derived, either in an interpolation or a regularization framework. The proposed kernels appear to be particularly convenient for analyzing the effect of each (group of) variable(s) and for computing sensitivity indices without recursion.
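For orientation, the standard construction behind such decompositions runs as follows (notation is ours and not taken from the paper). The μ-centered part of H has reproducing kernel

    k_0(x, x') = k(x, x') - \frac{\int_D k(x, s)\, d\mu(s) \int_D k(x', s)\, d\mu(s)}{\int_{D \times D} k(s, t)\, d\mu(s)\, d\mu(t)},

and, assuming a tensor-product structure over d input variables, the ANOVA kernel

    K(\mathbf{x}, \mathbf{x}') = \prod_{i=1}^{d} \bigl( 1 + k_0^{(i)}(x_i, x'_i) \bigr)

expands into 2^d mutually orthogonal terms, which is what allows the functional ANOVA representation of the best predictor, and hence the sensitivity indices, to be read off term by term.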
Abstract:
Cost-efficient operation while satisfying performance and availability guarantees in Service Level Agreements (SLAs) is a challenge for Cloud Computing, as these are potentially conflicting objectives. We present a framework for SLA management based on multi-objective optimization. The framework features a forecasting model for determining the best virtual machine-to-host allocation given the need to minimize SLA violations, energy consumption and resource waste. A comprehensive SLA management solution is proposed that uses event processing for monitoring and enables dynamic provisioning of virtual machines onto the physical infrastructure. We validated our implementation against several standard heuristics and showed that our approach performs significantly better.
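Not the paper's forecasting model, but a minimal greedy sketch of trading off the objectives named in the abstract when placing one virtual machine; the energy and waste proxies, weights and data structures are illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class Host:
    cpu_cap: float
    mem_cap: float
    cpu_used: float = 0.0
    mem_used: float = 0.0
    vms: list = field(default_factory=list)

def place_vm(vm_cpu, vm_mem, hosts, w_energy=1.0, w_waste=1.0):
    """Pick the feasible host with the lowest weighted score.

    SLA risk is handled by rejecting hosts that would be overcommitted;
    'energy' and 'waste' are crude stand-ins for the real objectives."""
    best, best_score = None, float("inf")
    for h in hosts:
        if h.cpu_used + vm_cpu > h.cpu_cap or h.mem_used + vm_mem > h.mem_cap:
            continue                                  # would violate capacity, SLA risk
        energy = 1.0 if not h.vms else 0.0            # powering on an idle host costs most
        waste = (h.cpu_cap - h.cpu_used - vm_cpu) / h.cpu_cap
        score = w_energy * energy + w_waste * waste
        if score < best_score:
            best, best_score = h, score
    if best is not None:
        best.cpu_used += vm_cpu
        best.mem_used += vm_mem
        best.vms.append((vm_cpu, vm_mem))
    return best

hosts = [Host(16, 64), Host(16, 64)]
print(place_vm(4, 8, hosts))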
Abstract:
We study state-based video communication where a client simultaneously informs the server about the presence status of various packets in its buffer. In sender-driven transmission, the client periodically sends to the server a single acknowledgement packet that provides information about all packets that have arrived at the client by the time the acknowledgement is sent. In receiver-driven streaming, the client periodically sends to the server a single request packet that comprises a transmission schedule for sending missing data to the client over a horizon of time. We develop a comprehensive optimization framework that enables computing packet transmission decisions that maximize the end-to-end video quality for the given bandwidth resources, in both scenarios. The core step of the optimization comprises computing the probability that a single packet will be communicated in error as a function of the expected transmission redundancy (or cost) used to communicate the packet. Through comprehensive simulation experiments, we carefully examine the performance advances that our framework enables relative to state-of-the-art scheduling systems that employ regular acknowledgement or request packets. Consistent gains in video quality of up to 2 dB are demonstrated across a variety of content types. We show that there is a direct analogy between the error-cost efficiency of streaming a single packet and the overall rate-distortion performance of streaming the whole content. In the case of sender-driven transmission, we develop an effective modeling approach that accurately characterizes the end-to-end performance as a function of the packet loss rate on the backward channel and the source encoding characteristics.
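A toy version of the per-packet error-cost trade-off, assuming i.i.d. packet losses with rate p and ideal feedback (a packet is retransmitted only while every earlier attempt was lost); this is not the paper's channel or scheduling model.

def error_cost_curve(p_loss, max_attempts):
    """(expected cost, error probability) pairs for one packet.

    With at most n transmissions and loss rate p: error(n) = p**n and the
    expected number of transmissions is (1 - p**n) / (1 - p)."""
    curve = []
    for n in range(1, max_attempts + 1):
        error = p_loss ** n
        cost = (1.0 - p_loss ** n) / (1.0 - p_loss)
        curve.append((cost, error))
    return curve

print(error_cost_curve(0.2, 4))   # e.g. 20 % loss, up to four attempts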
Abstract:
Derivation of probability estimates complementary to geophysical data sets has gained special attention over recent years. Information about the confidence level of the provided physical quantities is required to construct an error budget for higher-level products and to correctly interpret the final results of a particular analysis. Regarding the generation of products based on satellite data, a common input consists of a cloud mask, which allows discrimination between surface and cloud signals. Further, the surface information is divided into snow and snow-free components. At any step of this discrimination process, a misclassification in a cloud/snow mask propagates to higher-level products and may alter their usability. Within this scope, a novel probabilistic cloud mask (PCM) algorithm suited to the 1 km × 1 km Advanced Very High Resolution Radiometer (AVHRR) data is proposed, which provides three types of probability estimates: between cloudy and clear-sky, between cloudy and snow, and between clear-sky and snow conditions. As opposed to the majority of available techniques, which are usually based on a decision-tree approach, the PCM algorithm uses all spectral, angular and ancillary information in a single step to retrieve probability estimates from precomputed look-up tables (LUTs). Moreover, the issue of deriving a single threshold value for each spectral test is overcome by the concept of a multidimensional information space, which is divided into small bins by an extensive set of intervals. The discrimination between snow and ice clouds and the detection of broken, thin clouds are enhanced by means of the invariant coordinate system (ICS) transformation. The study area covers a wide range of environmental conditions, spanning from Iceland through central Europe to northern parts of Africa, which pose diverse difficulties for cloud/snow masking algorithms. The retrieved PCM cloud classification was compared to the Polar Platform System (PPS) version 2012 and Moderate Resolution Imaging Spectroradiometer (MODIS) collection 6 cloud masks, SYNOP (surface synoptic observations) weather reports, the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) vertical feature mask version 3, and the MODIS collection 5 snow mask. The outcomes of the conducted analyses demonstrate the good detection skill of the PCM method, with results comparable to or better than those of the reference PPS algorithm.
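A minimal sketch of the generic bin-and-look-up step described above; the feature set, bin edges and class labels are placeholders and do not reproduce the PCM algorithm's actual LUT design.

import numpy as np

def class_probabilities(obs, lut_counts, bin_edges):
    """P(class | bin) for the multidimensional bin an observation falls into.

    obs: 1-D feature vector (reflectances, brightness temperatures, angles, ...).
    lut_counts: dict class_name -> N-D array of training counts, one axis per
    feature, each axis of length len(edges) - 1 for the matching bin_edges."""
    idx = tuple(
        int(np.clip(np.digitize(x, edges) - 1, 0, len(edges) - 2))
        for x, edges in zip(obs, bin_edges)
    )
    counts = {cls: float(hist[idx]) for cls, hist in lut_counts.items()}
    total = sum(counts.values())
    if total == 0.0:
        return {cls: 1.0 / len(counts) for cls in counts}   # empty bin: uninformative
    return {cls: n / total for cls, n in counts.items()}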