960 results for rough sets


Relevance:

20.00%

Publisher:

Abstract:

Analytically or computationally intractable likelihood functions can arise in complex statistical inferential problems, making them inaccessible to standard Bayesian inferential methods. Approximate Bayesian computation (ABC) methods address such inferential problems by replacing direct likelihood evaluations with repeated sampling from the model. ABC methods have been applied predominantly to parameter estimation problems and less to model choice problems, owing to the added difficulty of handling multiple model spaces. The ABC algorithm proposed here addresses model choice problems by extending Fearnhead and Prangle (2012, Journal of the Royal Statistical Society, Series B 74, 1–28), in which the posterior means of the model parameters, estimated through regression, form the summary statistics used in the discrepancy measure. An additional stepwise multinomial logistic regression is performed on the model indicator variable in the regression step, and the estimated model probabilities are incorporated into the set of summary statistics for model choice purposes. A reversible jump Markov chain Monte Carlo step is also included in the algorithm to increase model diversity and ensure thorough exploration of the model space. The algorithm was applied to a validating example to demonstrate its robustness across a wide range of true model probabilities. Its subsequent use in three pathogen transmission examples of varying complexity illustrates its utility in inferring which transmission models are preferred for the pathogens.
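
As a hedged illustration of the core idea only, the following Python sketch uses model probabilities estimated by a (non-stepwise) multinomial logistic regression on pilot simulations as the summary statistics in a plain rejection-ABC model choice step. The toy models, priors, tolerance and helper names are all hypothetical, and the paper's stepwise selection and reversible jump MCMC components are omitted.

```python
# Minimal sketch of ABC model choice with regression-estimated model
# probabilities as summary statistics (illustration only; the paper's
# stepwise selection and reversible jump MCMC steps are omitted).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_obs = 50

def simulate(model, theta, n=n_obs):
    """Toy competing models: 0 = Normal(theta, 1), 1 = Exponential(mean theta)."""
    if model == 0:
        return rng.normal(theta, 1.0, size=n)
    return rng.exponential(theta, size=n)

def raw_summaries(x):
    """Crude data summaries fed into the regression step."""
    return np.array([x.mean(), x.std(), np.median(x)])

# --- Pilot stage: learn model probabilities from raw summaries -------------
n_pilot = 5000
pilot_models = rng.integers(0, 2, size=n_pilot)        # uniform model prior
pilot_thetas = rng.uniform(0.5, 5.0, size=n_pilot)     # hypothetical prior
pilot_S = np.array([raw_summaries(simulate(m, t))
                    for m, t in zip(pilot_models, pilot_thetas)])
clf = LogisticRegression(max_iter=1000).fit(pilot_S, pilot_models)

def summary(x):
    """Estimated model probabilities act as the model-choice summary."""
    return clf.predict_proba(raw_summaries(x).reshape(1, -1)).ravel()

# --- Rejection-ABC stage ----------------------------------------------------
y_obs = rng.exponential(2.0, size=n_obs)               # pretend observed data
s_obs = summary(y_obs)

n_sim, tol = 20000, 0.1
kept = []
for _ in range(n_sim):
    m = rng.integers(0, 2)
    t = rng.uniform(0.5, 5.0)
    if np.linalg.norm(summary(simulate(m, t)) - s_obs) < tol:
        kept.append(m)

kept = np.array(kept)
for m in (0, 1):
    print(f"posterior P(model {m}) ~= {np.mean(kept == m):.3f}  (n={len(kept)})")
```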

Relevance:

20.00%

Publisher:

Abstract:

Modern power systems have become more complex due to growth in load demand, the installation of Flexible AC Transmission Systems (FACTS) devices and the integration of new HVDC links into existing AC grids. At the same time, the introduction of deregulated and unbundled power market operational mechanisms, together with changes in generation sources, including the connection of large renewable energy generation that is intermittent in nature, has further increased the complexity and uncertainty of power system operation and control. System operators and engineers therefore face a series of technical challenges in operating today's interconnected power systems. Among these challenges, how to evaluate the steady-state and dynamic behaviors of existing interconnected power systems effectively and accurately, using more powerful computational analysis models and approaches, has become one of the key issues in power engineering. Traditional computing techniques have been widely used for power system analysis in various fields with varying degrees of success. The rapid development of computational intelligence, such as neural networks, fuzzy systems and evolutionary computation, provides tools and opportunities to solve complex technical problems in power system planning, operation and control.

Relevance:

20.00%

Publisher:

Abstract:

Introduction Vascular access devices (VADs), such as peripheral or central venous catheters, are vital across all medical and surgical specialties. To allow therapy or haemodynamic monitoring, VADs frequently require administration sets (AS) composed of infusion tubing, fluid containers, pressure-monitoring transducers and/or burettes. While VADs are replaced only when necessary, AS are routinely replaced every 3–4 days in the belief that this reduces infectious complications. Strong evidence supports AS use up to 4 days, but there is less evidence for AS use beyond 4 days. AS replacement twice weekly increases hospital costs and workload. Methods and analysis This is a pragmatic, multicentre, randomised controlled trial (RCT) of equivalence design comparing AS replacement at 4 (control) versus 7 (experimental) days. Randomisation is stratified by site and device, centrally allocated and concealed until enrolment. 6554 adult/paediatric patients with a central venous catheter, peripherally inserted central catheter or peripheral arterial catheter will be enrolled over 4 years. The primary outcome is VAD-related bloodstream infection (BSI), and secondary outcomes are VAD colonisation, AS colonisation, all-cause BSI, all-cause mortality, number of AS per patient, VAD time in situ and costs. Relative incidence rates of VAD-BSI per 100 devices and hazard rates per 1000 device-days (95% CIs) will summarise the impact of 7-day relative to 4-day AS use and test equivalence. Kaplan-Meier survival curves (with the log-rank Mantel-Cox test) will compare VAD-BSI over time. Appropriate parametric or non-parametric techniques will be used to compare secondary end points. p values of <0.05 will be considered significant.
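
As a rough illustration only (not the trial's analysis code), the sketch below shows how incidence rates per 100 devices and per 1000 device-days, and an approximate confidence interval for the rate ratio between the 7-day and 4-day arms, could be computed; all counts are made-up placeholders.

```python
# Illustrative sketch (not the trial's analysis code): incidence of VAD-BSI
# per 100 devices and per 1000 device-days, with a Wald confidence interval
# for the rate ratio of 7-day vs 4-day AS replacement.  Counts are made up.
import math

def rate_ratio_ci(events_a, exposure_a, events_b, exposure_b, z=1.96):
    """Rate ratio (A vs B) with an approximate log-scale Wald CI."""
    rr = (events_a / exposure_a) / (events_b / exposure_b)
    se = math.sqrt(1.0 / events_a + 1.0 / events_b)
    return rr, rr * math.exp(-z * se), rr * math.exp(z * se)

# Hypothetical arm-level data: BSI events, devices, device-days.
arm_7day = dict(events=14, devices=3270, device_days=29500)
arm_4day = dict(events=12, devices=3284, device_days=28900)

for arm, name in ((arm_4day, "4-day"), (arm_7day, "7-day")):
    per100 = 100 * arm["events"] / arm["devices"]
    per1000d = 1000 * arm["events"] / arm["device_days"]
    print(f"{name}: {per100:.2f} BSI per 100 devices, "
          f"{per1000d:.2f} per 1000 device-days")

rr, lo, hi = rate_ratio_ci(arm_7day["events"], arm_7day["device_days"],
                           arm_4day["events"], arm_4day["device_days"])
print(f"rate ratio (7-day / 4-day) = {rr:.2f}, 95% CI {lo:.2f} to {hi:.2f}")
# Equivalence would be judged by comparing the CI with pre-specified margins.
```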

Relevance:

20.00%

Publisher:

Abstract:

Rapid advances in sequencing technologies (Next Generation Sequencing, or NGS) have led to a vast increase in the quantity of bioinformatics data available, and this increasing scale presents enormous challenges to researchers seeking to identify complex interactions. This paper is concerned with the domain of transcriptional regulation, and the use of visualisation to identify relationships between specific regulatory proteins (the transcription factors, or TFs) and their associated target genes (TGs). We present preliminary work from an ongoing study which aims to determine the effectiveness of different visual representations and large-scale displays in supporting discovery. Following an iterative process of implementation and evaluation, representations were tested by potential users in the bioinformatics domain to determine their efficacy, and to better understand the range of ad hoc practices among bioinformatics-literate users. Results from two rounds of small-scale user studies are considered, with initial findings suggesting that bioinformaticians require richly detailed views of TF data, features to compare TF layouts between organisms quickly, and ways to keep track of interesting data points.

Relevance:

20.00%

Publisher:

Abstract:

Sampling strategies are developed based on the idea of ranked set sampling (RSS) to increase efficiency and thereby reduce the cost of sampling in fishery research. RSS incorporates information on concomitant variables that are correlated with the variable of interest into the selection of samples. For example, estimating a monitoring survey abundance index would be more efficient if the sampling sites were selected based on information from previous surveys or catch rates of the fishery. We use two practical fishery examples to demonstrate the approach: site selection for a fishery-independent monitoring survey in the Australian northern prawn fishery (NPF) and fish age prediction by simple linear regression modelling for a short-lived tropical clupeoid. The relative efficiencies of the new designs were derived analytically and compared with traditional simple random sampling (SRS). Optimal sampling schemes were assessed under different optimality criteria. For the NPF monitoring survey, the efficiency, in terms of the variance or mean squared error of the estimated mean abundance index, ranged from 114% to 199% relative to SRS. In the case of a fish ageing study for Tenualosa ilisha in Bangladesh, the efficiency of age prediction from fish body weight reached 140%.
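
As a hedged sketch of the basic idea, not the paper's survey design, the following simulation compares ranked set sampling against simple random sampling for estimating a mean, using a hypothetical concomitant variable for ranking; the population, correlation and set sizes are illustrative assumptions.

```python
# Minimal simulation sketch of ranked set sampling (RSS) vs simple random
# sampling (SRS) for estimating a population mean.  Ranking uses a correlated
# concomitant variable (e.g. a previous survey's catch rate); the population
# below is hypothetical.
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical population: y is the variable of interest, x a concomitant
# variable correlated with y.
N = 10000
x = rng.gamma(2.0, 2.0, size=N)
y = 5.0 + 2.0 * x + rng.normal(0.0, 2.0, size=N)

def srs_mean(m, r):
    idx = rng.choice(N, size=m * r, replace=False)
    return y[idx].mean()

def rss_mean(m, r):
    """One RSS cycle: draw m sets of m units, rank each set by x, and measure
    y only on the i-th ranked unit of the i-th set; repeat for r cycles."""
    measured = []
    for _ in range(r):
        for i in range(m):
            idx = rng.choice(N, size=m, replace=False)
            ranked = idx[np.argsort(x[idx])]
            measured.append(y[ranked[i]])
    return np.mean(measured)

m, r, reps = 5, 4, 2000        # same measured sample size (m * r) for both designs
srs_est = np.array([srs_mean(m, r) for _ in range(reps)])
rss_est = np.array([rss_mean(m, r) for _ in range(reps)])
print(f"true mean           : {y.mean():.3f}")
print(f"SRS variance        : {srs_est.var():.4f}")
print(f"RSS variance        : {rss_est.var():.4f}")
print(f"relative efficiency : {100 * srs_est.var() / rss_est.var():.0f}%")
```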

Relevance:

20.00%

Publisher:

Abstract:

The specific activity and content of cytochrome oxidase in the rough endoplasmic reticulum–mitochondrion complex are higher than in the mitochondrial fraction. Radiolabelling studies with the use of hepatocytes and isolated microsomal and rough endoplasmic reticulum–mitochondrion fractions, followed by immunoprecipitation with anti-(cytochrome oxidase) antibody, reveal that the nuclear-coded cytoplasmic subunits of cytochrome oxidase are preferentially synthesized in the latter fraction. The results have a bearing on the mechanism of transport of these subunits into mitochondria.

Relevance:

20.00%

Publisher:

Abstract:

Background: Plotless density estimators are those that are based on distance measures rather than counts per unit area (quadrats or plots) to estimate the density of some usually stationary event, e.g. burrow openings, damage to plant stems, etc. These estimators typically use distance measures between events and from random points to events to derive an estimate of density. The error and bias of these estimators for the various spatial patterns found in nature have previously been examined using simulated populations only. In this study we investigated eight plotless density estimators to determine which were robust across a wide range of data sets from fully mapped field sites. The sites covered a wide range of situations, including animal damage to rice and corn, nest locations, active rodent burrows and the distribution of plants. Monte Carlo simulations were applied to sample the data sets, and in all cases the error of the estimate (measured as relative root mean square error) was reduced with increasing sample size. The method of calculation and ease of use in the field were also used to judge the usefulness of each estimator. Estimators were evaluated in their original published forms, although the variable area transect (VAT) and ordered distance methods have been the subjects of optimization studies. Results: An estimator that was a compound of three basic distance estimators was found to be robust across all spatial patterns for sample sizes of 25 or greater. The same field methodology can be used either with the basic distance formula or with the formula used in the Kendall-Moran estimator, in which case a reduction in error may be gained for sample sizes less than 25; however, there is no improvement for larger sample sizes. The variable area transect (VAT) method performed moderately well, is easy to use in the field, and its calculations are easy to undertake. Conclusion: Plotless density estimators can provide an estimate of density in situations where it would not be practical to lay out a plot or quadrat, and can in many cases reduce the workload in the field.
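
For orientation, here is a hedged sketch of one of the simplest point-to-nearest-event distance estimators, the Poisson-based form density ≈ n / (π Σ rᵢ²), applied to a simulated site. It is not the compound or Kendall-Moran estimator evaluated in the study, edge effects are ignored, and all names and constants are illustrative.

```python
# Sketch of a basic plotless (distance-based) density estimator: measure the
# distance from each of n random sample points to its nearest event and use
# the Poisson-based estimate  density ~ n / (pi * sum(r_i**2)).  This is only
# a simple component estimator, not the compound estimator from the paper,
# and edge effects are ignored.
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical fully mapped site: events (e.g. burrow openings) on a 100 x 100 area.
true_density = 0.02                        # events per unit area
area = 100.0 * 100.0
events = rng.uniform(0, 100, size=(rng.poisson(true_density * area), 2))

def nearest_event_distances(points, events):
    """Distance from each sample point to the closest event."""
    d = np.linalg.norm(points[:, None, :] - events[None, :, :], axis=2)
    return d.min(axis=1)

n = 25                                     # sample size found adequate in the study
sample_points = rng.uniform(0, 100, size=(n, 2))
r = nearest_event_distances(sample_points, events)
density_hat = n / (np.pi * np.sum(r ** 2))

print(f"events placed    : {len(events)}")
print(f"true density     : {true_density:.4f}")
print(f"estimated density: {density_hat:.4f}")
```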

Relevance:

20.00%

Publisher:

Abstract:

Typhoid fever is becoming an ever-increasing threat in developing countries. We have improved considerably upon the existing PCR-based diagnosis method by designing primers against a region that is unique to Salmonella enterica subsp. enterica serovar Typhi and Salmonella enterica subsp. enterica serovar Paratyphi A, corresponding to the STY0312 gene in S. Typhi and its homolog SPA2476 in S. Paratyphi A. An additional set of primers amplifies another region present in S. Typhi CT18 and S. Typhi Ty2, corresponding to the region between genes STY0313 and STY0316, which is absent in S. Paratyphi A. The possibility of a false-negative result arising from mutation in hypervariable genes has been reduced by targeting a gene unique to typhoidal Salmonella serovars as a diagnostic marker. The amplified region has been tested for genomic stability by amplifying the region from clinical isolates of patients from various geographical locations in India, thereby showing that this region is potentially stable. This set of primers can also differentiate between S. Typhi CT18, S. Typhi Ty2, and S. Paratyphi A, which have stable deletions in this specific locus. The PCR assay designed in this study has a sensitivity of 95%, compared to the Widal test, which has a sensitivity of only 63%. In certain cases the PCR assay was even more sensitive than the blood culture test, as PCR-based detection can also detect dead bacteria.

Relevance:

20.00%

Publisher:

Abstract:

Extreme vibration has been reported for small, high speed craft in the maritime sector, with performance-degrading and health-threatening effects on boat operators and crew. Musculoskeletal injuries are an enduring problem for high speed craft passengers. Spinal or joint injuries and neurological disorders may occur from repetitive pounding over rough water, continued vibration and single impact events. The risk from whole body vibration (WBV) induced through small vessels depends mainly on the time spent on the craft, which cannot be changed in a military scenario, as well as on the number of shocks and jolts and their magnitude and frequency. In the European Union, for example, the physical agents directives require all employers to control exposure to a number of physical agents, including noise and vibration. The EC Vibration Directive 2002/44/EC then sets out regulations for the control of health and safety risks from the exposure of workers to hand-arm vibration (HAV) and WBV in the workplace. Australia has an exposure standard relating to WBV, AS 2670.1-2001 – Evaluation of human exposure to whole body vibration, which is identical to ISO 2631-1:1997, Mechanical vibration and shock – Evaluation of human exposure to whole-body vibration. Currently, none of the jurisdictions in Australia have specific regulations for vibration exposure in workplaces; however, vibration is mentioned to varying degrees in their general regulations, codes of practice and guidance material. WBV on high speed craft is normally caused by continuous "hammering" from short, steep seas or wind-against-tide conditions, while shock on high speed craft is usually caused by random impacts. Military organisations need the knowledge to make informed decisions regarding their marine operations, compliance with legislation and potentially harmful health effects, and to develop and implement appropriate counter-measures. Marine case studies in the UK, such as published MAIB (Marine Accident Investigation Branch) reports, show injuries that have occurred in operation, and subsequent MCA (Maritime Coastguard Agency) guidance has been provided (MGN 436 (M+F), Whole-Body Vibration: Guidance on Mitigating Against the Effects of Shocks and Impacts on Small Vessels. MCA, 2011). This paper proposes a research framework to study the origin, impact and pathways for prevention of WBV in small, high speed craft in a maritime environment.
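
As a hedged illustration of how exposure might be characterised under ISO 2631-1 / AS 2670.1, the sketch below computes the frequency-weighted r.m.s. acceleration and the vibration dose value (VDV) from an acceleration record; the signal is synthetic, the Wk frequency weighting is assumed to have been applied already, and this is not a compliance assessment tool.

```python
# Sketch of two whole-body vibration exposure metrics from ISO 2631-1:
# frequency-weighted r.m.s. acceleration and the vibration dose value (VDV).
# The acceleration record is synthetic and assumed already frequency-weighted.
import numpy as np

fs = 500.0                                  # sample rate, Hz
t = np.arange(0, 60.0, 1.0 / fs)            # one minute of (weighted) data

# Synthetic weighted vertical acceleration: broadband motion plus occasional
# slamming impacts, loosely imitating a small craft in short, steep seas.
rng = np.random.default_rng(3)
a_w = 0.8 * np.sin(2 * np.pi * 2.0 * t) + 0.3 * rng.normal(size=t.size)
impact_idx = rng.choice(t.size, size=12, replace=False)
a_w[impact_idx] += rng.uniform(5.0, 12.0, size=12)       # shock events, m/s^2

rms = np.sqrt(np.mean(a_w ** 2))                         # m/s^2
vdv = np.trapz(a_w ** 4, dx=1.0 / fs) ** 0.25            # m/s^1.75

print(f"weighted r.m.s. acceleration: {rms:.2f} m/s^2")
print(f"vibration dose value (VDV)  : {vdv:.2f} m/s^1.75")
print(f"crest factor                : {np.max(np.abs(a_w)) / rms:.1f}")
# A high crest factor is one indication that VDV, rather than r.m.s. alone,
# is needed to characterise shock-dominated exposure.
```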

Relevance:

20.00%

Publisher:

Abstract:

The topic of this dissertation lies in the intersection of harmonic analysis and fractal geometry. We particularly consider singular integrals in Euclidean spaces with respect to general measures, and we study how the geometric structure of the measures affects certain analytic properties of the operators. The thesis consists of three research articles and an overview. In the first article we construct singular integral operators on lower dimensional Sierpinski gaskets associated with homogeneous Calderón-Zygmund kernels. While these operators are bounded, their principal values fail to exist almost everywhere. Conformal iterated function systems generate a broad range of fractal sets. In the second article we prove that many of these limit sets are porous in a very strong sense, by showing that they contain holes spread in every direction. We then connect these results with singular integrals, exploiting the fractal structure of the limit sets to establish that singular integrals associated with very general kernels converge weakly. Boundedness questions form a central topic of investigation in the theory of singular integrals. In the third article we study singular integrals of different measures. We prove a very general boundedness result in the case where the two underlying measures are separated by a Lipschitz graph. As a consequence we show that a certain weak convergence holds for a large class of singular integrals.
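
To fix notation, the following standard formulation (not quoted from the thesis) is what is meant by the principal value of a singular integral with respect to a measure μ:

```latex
% Standard formulation of the truncated singular integral and its principal value.
\[
  T_{\varepsilon}\mu(x) = \int_{|x-y|>\varepsilon} K(x-y)\, d\mu(y),
  \qquad
  \operatorname{p.v.}\, T\mu(x) = \lim_{\varepsilon \to 0} T_{\varepsilon}\mu(x),
\]
% where $K$ is a homogeneous Calderon-Zygmund kernel.
```

The truncated operators T_ε can remain uniformly bounded even when this limit fails to exist μ-almost everywhere, which is the phenomenon exhibited on the lower dimensional Sierpinski gaskets in the first article.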

Relevance:

20.00%

Publisher:

Abstract:

In this thesis we study a few games related to non-wellfounded and stationary sets. Games have turned out to be an important tool in mathematical logic, ranging from semantic games, which define the truth of a sentence in a given logic, to games on real numbers, whose determinacy has important consequences for the consistency of certain large cardinal assumptions. The equality of non-wellfounded sets can be determined by a so-called bisimulation game, already used to identify processes in theoretical computer science and possible-world models in modal logic. Here we present a game to classify non-wellfounded sets according to their branching structure. We also describe a way to approximate non-wellfounded sets with hereditarily finite wellfounded sets; the framework used to do this is domain theory. Moving back to classical wellfounded set theory, we also study games on stationary sets. In the Banach-Mazur game, also called the ideal game, the players play a descending sequence of stationary sets and the second player tries to keep their intersection stationary. The game is connected to the precipitousness of the corresponding ideal. In the pressing-down game the first player plays regressive functions defined on stationary sets and the second player responds with a stationary set on which the function is constant, trying to keep the intersection stationary. This game has applications in model theory to the determinacy of the Ehrenfeucht-Fraïssé game. We show that it is consistent that these games are not equivalent.
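
For context (standard background, not quoted from the thesis), the second player's moves in the pressing-down game always exist because of Fodor's pressing-down lemma:

```latex
% Fodor's pressing-down lemma (standard background, not from the thesis).
\[
  \text{If } \kappa \text{ is regular and uncountable, } S \subseteq \kappa
  \text{ is stationary, and } f : S \to \kappa \text{ satisfies }
  f(\alpha) < \alpha \text{ for every } \alpha \in S \setminus \{0\},
\]
\[
  \text{then } f \text{ is constant on some stationary } S' \subseteq S .
\]
```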

Relevance:

20.00%

Publisher:

Abstract:

Bearing area analysis has been used to study the real area of contact and compliance of rough turned steel cylinders in compression. Calculations show that the elastic real area of contact is very small compared to the plastic real area of contact, and that local compliance due to flattening of asperity tips is a small proportion of the total compliance obtained from experiments. The fact that increased load brings more and more new asperities under load rather than enlarging the contact spots leads to a rather simple load-compliance relation for a rough cylinder, viz., W' = N_h · K_1 δ^n, where W_0 = K_1 δ^n defines the load-compliance relation of the individual asperities and N_h represents the number of asperities bearing the load.
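
As a small numerical sketch of the quoted relation (the constants and asperity height distribution are hypothetical, and this is an illustration rather than the paper's analysis), the code below counts how many asperities are under load at a given compliance and evaluates W' = N_h · K_1 δ^n:

```python
# Numerical sketch of the load-compliance relation quoted above: the number
# of asperities in contact, N_h, is counted from a hypothetical distribution
# of summit heights, and each loaded asperity is assumed to follow the
# individual law W0 = K1 * delta**n.  All constants are made up.
import numpy as np

rng = np.random.default_rng(4)

K1, n_exp = 2.0e4, 1.5                        # per-asperity constants (hypothetical)
heights = rng.normal(0.0, 2.0, size=20000)    # summit heights, micrometres
top = heights.max()                           # first summit to touch the flat

for delta in (0.5, 1.0, 2.0, 4.0):            # overall compliance, micrometres
    n_h = np.count_nonzero(heights > top - delta)   # asperities now under load
    w = n_h * K1 * delta ** n_exp                   # W' = N_h * K1 * delta^n
    print(f"delta = {delta:4.1f} um : N_h = {n_h:6d}, W' ~ {w:.3e} (arb. units)")
# The rapid growth of N_h with delta, rather than growth of individual contact
# spots, is the effect the bearing area analysis identifies as dominant.
```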

Relevance:

20.00%

Publisher:

Abstract:

The lipid A and lipopolysaccharide (LPS) binding and neutralizing activities of a synthetic, polycationic, amphiphilic peptide were studied. The branched peptide, designed as a functional analog of polymyxin B, has a six residue hydrophobic sequence, bearing at its N-terminus a penultimate lysine residue whose alpha- and epsilon-amino groups are coupled to two terminal lysine residues. In fluorescence spectroscopic studies designed to examine relative affinities of binding to the toxin, neutralization of surface charge and fluidization of the acyl domains, the peptide was active, closely resembling the effects of polymyxin B and its nonapeptide derivative; however, the synthetic peptide does not induce phase transitions in LPS aggregates as do polymyxin B and polymyxin B nonapeptide. The peptide was also comparable with polymyxin B in its ability to inhibit LPS-mediated IL-1 and IL-6 release from human peripheral blood mononuclear cells. The synthetic compound is devoid of antibacterial activities and did not induce conductance fluxes in LPS-containing asymmetric planar membranes. These results strengthen the premise that basicity and amphiphilicity are necessary and sufficient physical properties that ascribe endotoxin binding and neutralizing activities, and further suggest that antibacterial/membrane perturbant and LPS neutralizing activities are dissociable, which may be of value in designing LPS-sequestering agents of low toxicity.