943 results for Platelet Count


Relevance: 10.00%

Publisher:

Abstract:

This paper proposes a flying-capacitor-based chopper circuit for dc capacitor voltage equalization in diode-clamped multilevel inverters. Its important features are reduced voltage stress across the chopper switches, a possible reduction in the chopper switching frequency, improved reliability, and enhanced ride-through capability. The topology is analyzed using three- and four-level flying-capacitor-based chopper configurations. These configurations differ in capacitor and semiconductor device count and reduce the device voltage stresses by half and one-third, respectively. The detailed working principles and control schemes for these circuits are presented. It is shown that, by preferentially selecting among the available chopper switch states, the dc-link capacitor voltages can be efficiently equalized while the flying-capacitor voltages remain tightly regulated around their references. The various operating modes of the chopper are described along with the preferential selection logic used to achieve the desired performance. The performance of the proposed chopper and the corresponding control schemes is confirmed through both simulation and experimental investigations.
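The preferential state-selection idea can be sketched as a small control loop. This is a minimal, hypothetical model: the state names, effect vectors, and weights below are illustrative assumptions, not the paper's circuit equations.

```python
# Illustrative sketch (not the paper's controller): preferential selection of
# redundant chopper switch states.  Each state is modeled only by its assumed
# effect on the flying-capacitor voltage error and the dc-link imbalance;
# the state names and effect vectors are hypothetical.

# Hypothetical redundant states: (d_vfc, d_imbalance) per switching period,
# i.e. the signed change each state produces in the flying-capacitor voltage
# error and in the dc-link capacitor voltage imbalance (toy magnitudes, volts).
STATES = {
    "S1": (+1.0, -1.0),
    "S2": (+1.0, +1.0),
    "S3": (-1.0, -1.0),
    "S4": (-1.0, +1.0),
}

def select_state(vfc_error, imbalance, w_fc=1.0, w_dc=1.0):
    """Pick the redundant state that best reduces both voltage errors."""
    def predicted_cost(effect):
        d_vfc, d_imb = effect
        return w_fc * abs(vfc_error + d_vfc) + w_dc * abs(imbalance + d_imb)
    return min(STATES, key=lambda s: predicted_cost(STATES[s]))

# Flying cap 2 V above reference, dc-link imbalance +3 V: the controller
# prefers a state that discharges both.
print(select_state(vfc_error=2.0, imbalance=3.0))  # -> S3
```

Because several switch states produce the same output voltage, the controller is free to pick, each period, the redundant state whose predicted effect drives both errors toward zero, which is the essence of the preferential selection logic described above.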


The most costly operations encountered in pairing computations are those that take place in the full extension field Fp^k. At high levels of security, the complexity of operations in Fp^k dominates the complexity of the operations that occur in the lower-degree subfields. Consequently, full extension field operations have the greatest effect on the runtime of Miller's algorithm. Many recent optimizations in the literature have focused on improving the overall operation count by presenting new explicit formulas that reduce the number of subfield operations encountered throughout an iteration of Miller's algorithm. Unfortunately, almost all of these improvements suffer for larger embedding degrees, where the expensive extension field operations far outweigh the operations in the smaller subfields. In this paper, we propose a new way of carrying out Miller's algorithm, based on new explicit formulas that reduce the number of full extension field operations occurring in an iteration of the Miller loop, resulting in significant speed-ups of between 5 and 30 percent in most practical situations.
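The dominance of full extension field operations can be illustrated with a simple cost model. The Karatsuba/Toom tower estimates used here are standard (a quadratic extension multiplication costs 3 base multiplications, a cubic one costs 6), but the per-iteration operation budgets are hypothetical, not the paper's formulas.

```python
# Illustrative cost model, not the paper's exact operation counts: estimate
# what saving one full-extension-field multiplication per Miller-loop
# iteration is worth.  Costs are measured in base-field (Fp) multiplications,
# using the standard tower estimates m(2k) = 3*m(k) and m(3k) = 6*m(k).

def ext_mul_cost(k):
    """Fp^k multiplication cost, in Fp multiplications, for a 2-3 tower."""
    cost = 1
    while k % 2 == 0:
        cost *= 3
        k //= 2
    while k % 3 == 0:
        cost *= 6
        k //= 3
    assert k == 1, "k must be of the form 2^a * 3^b"
    return cost

def iteration_cost(k, full_ext_ops, subfield_mults):
    """Per-iteration cost: full-extension-field ops plus subfield work."""
    return full_ext_ops * ext_mul_cost(k) + subfield_mults

# Hypothetical budgets: a baseline doubling step with 2 full Fp^12 operations
# vs. a variant trading one of them for extra subfield multiplications.
k = 12
baseline = iteration_cost(k, full_ext_ops=2, subfield_mults=20)
variant = iteration_cost(k, full_ext_ops=1, subfield_mults=35)
speedup = 100 * (baseline - variant) / baseline
print(f"Fp^{k} mult ~ {ext_mul_cost(k)} Fp mults; speedup ~ {speedup:.0f}%")
```

Since one Fp^12 multiplication already costs on the order of 54 base-field multiplications, trading a full extension field operation for even a moderate amount of extra subfield work pays off, which is why such savings compound into the quoted 5 to 30 percent range.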


ROLE OF THE LOW-AFFINITY β1-ADRENERGIC RECEPTOR IN NORMAL AND DISEASED HEARTS. Background: The β1-adrenergic receptor (AR) has at least two binding sites, β1HAR and β1LAR (the high- and low-affinity sites of the β1AR, respectively), both of which cause cardiostimulation. Some β-blockers, for example (-)-pindolol and (-)-CGP 12177, can activate β1LAR at higher concentrations than those required to block β1HAR. While β1HAR can be blocked by all clinically used β-blockers, β1LAR is relatively resistant to blockade. Thus, chronic β1LAR activation may occur in the setting of β-blocker therapy, thereby mediating persistent βAR signaling, and it is important to determine the potential significance of β1LAR in vivo, particularly in disease settings. Method and results: C57Bl/6 male mice were used. Chronic (4-week) β1LAR activation was achieved by treatment with (-)-CGP12177 via osmotic minipump. Cardiac function was assessed by echocardiography and catheterization. (-)-CGP12177 treatment in healthy mice increased heart rate and left ventricular (LV) contractility without detectable LV remodelling or hypertrophy. In mice subjected to an 8-week period of aortic banding, (-)-CGP12177 treatment given during weeks 4-8 led to a positive inotropic effect. (-)-CGP12177 treatment exacerbated LV remodelling, indicated by a worsening of LV hypertrophy by ??% (estimated by weight, wall thickness, and cardiomyocyte size) and of interstitial/perivascular fibrosis (by histology). Importantly, (-)-CGP12177 treatment of aorta-banded mice exacerbated cardiac expression of hypertrophic, fibrogenic, and inflammatory genes (all p<0.05 vs. non-treated controls with aortic banding). Conclusion: β1LAR activation provides functional support to the heart in both normal and diseased (pressure-overload) settings. Sustained β1LAR activation in the diseased heart exacerbates LV remodelling and may therefore promote disease progression from compensatory hypertrophy to heart failure.


In public venues, crowd size is a key indicator of crowd safety and stability. In this paper we propose a crowd counting algorithm that uses tracking and local features to count the number of people in each group as represented by a foreground blob segment, so that the total crowd estimate is the sum of the group sizes. Tracking is employed to improve the robustness of the estimate, by analysing the history of each group, including splitting and merging events. A simplified ground truth annotation strategy results in an approach with minimal setup requirements that is highly accurate.
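The blob-to-count idea can be sketched in a few lines, assuming a hypothetical fixed area-per-person feature in place of the paper's trained local-feature estimator, and a median over each tracked group's history standing in for its splitting/merging analysis.

```python
# Minimal sketch of blob-based crowd counting (hypothetical constants, not the
# paper's trained estimator): each foreground blob holds a group of people,
# the group size is estimated from a local feature (here just blob area), and
# tracking smooths each group's estimate over its history.
from collections import defaultdict, deque

AVG_PERSON_AREA = 400.0  # assumed pixels per person; a hypothetical constant

def group_size(blob_area):
    """Estimate how many people a single foreground blob contains."""
    return max(1, round(blob_area / AVG_PERSON_AREA))

class CrowdCounter:
    """Tracks per-group estimates and reports the total crowd size."""
    def __init__(self, history=5):
        self.histories = defaultdict(lambda: deque(maxlen=history))

    def update(self, blobs):
        """blobs: dict of track_id -> blob area for the current frame."""
        for track_id, area in blobs.items():
            self.histories[track_id].append(group_size(area))
        # Upper median over each track's history damps segmentation noise.
        total = 0
        for track_id in blobs:
            h = sorted(self.histories[track_id])
            total += h[len(h) // 2]
        return total

counter = CrowdCounter()
counter.update({1: 800, 2: 400})          # two groups: ~2 people and ~1 person
print(counter.update({1: 1200, 2: 400}))  # group 1 grows -> 4
```

The total estimate is the sum of the per-group sizes, as in the abstract; the per-track history is where splitting and merging events would be reconciled in a full implementation.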


This thesis is a problematisation of the teaching of art to young children. To problematise a domain of social endeavour is, in Michel Foucault's terms, to ask how we come to believe that "something ... can and must be thought" (Foucault, 1985:7). The aim is to document what counts (i.e., what is sayable, thinkable, feelable) as proper art teaching in Queensland at this point of historical time. In this sense, the thesis is a departure from more recognisable research on 'more effective' teaching, including critical studies of art teaching and early childhood teaching. It treats 'good teaching' as an effect of moral training made possible through disciplinary discourses organised around certain epistemic rules at a particular place and time. There are four key tasks accomplished within the thesis. The first is to describe an event which is not easily resolved by means of orthodox theories or explanations, either liberal-humanist or critical ones. The second is to indicate how poststructuralist understandings of the self and social practice enable fresh engagements with uneasy pedagogical moments. What follows this discussion is the documentation of an empirical investigation into texts generated by early childhood teachers, artists and parents about what constitutes 'good practice' in art teaching. Twenty-two participants produced text to tell and re-tell the meaning of 'proper' art education, from different subject positions. Rather than attempting to capture 'typical' representations of art education in the early years, a pool of 'exemplary' teachers, artists and parents was chosen, using "purposeful sampling", and from this pool, three videos were filmed and later discussed by the audience of participants.
The fourth aspect of the thesis involves developing a means of analysing these texts in such a way as to allow a 're-description' of the field of art teaching, by attempting to foreground the epistemic rules through which such teacher-generated texts come to count as true, i.e., as propriety in art pedagogy. This analysis drew on Donna Haraway's (1995) understanding of 'ironic' categorisation to hold the tensions within the propositions inside the categories of analysis, rather than setting these up as discursive oppositions. The analysis is therefore ironic in the sense that Richard Rorty (1989) understands the term to apply to social scientific research. Three 'ironic' categories were argued to inform the discursive construction of 'proper' art teaching: a teacher should (a) teach without teaching; (b) manufacture the natural; and (c) train for creativity. These ironic categories work to undo modernist assumptions about theory/practice gaps and about finding a 'balance' between oppositional binary terms. They were produced through a discourse-theoretical reading of the texts generated by the participants in the study, texts that these same individuals use as a means of discipline and self-training as they work to teach properly. In arguing the usefulness of such approaches to empirical data analysis, the thesis challenges early childhood research in arts education in relation to its capacity to deal with ambiguity and to acknowledge contradiction in the work of teachers and in their explanations for what they do. It works as a challenge at a range of levels: at the level of theorising, of method and of analysis. In opening up thinking about normalised categories, and questioning traditional Western philosophy and the grand narratives of early childhood art pedagogy, it makes a space for re-thinking art pedagogy as "a game of truth and error" (Foucault, 1985). In doing so, it opens up a space for thinking how art education might be otherwise.


Earlier studies have shown that the influence of fixation stability on bone healing diminishes with advanced age. The goal of this study was to unravel the relationship between mechanical stimulus and age on callus competence at the tissue level. Using 3D in vitro micro-computed tomography derived metrics, 2D in vivo radiography, and histology, we investigated the influences of age and varying fixation stability on callus size, geometry, microstructure, composition, remodeling, and vascularity. Four groups with a 1.5-mm osteotomy gap in the femora of Sprague–Dawley rats were compared: young rigid (YR), young semirigid (YSR), old rigid (OR), and old semirigid (OSR). The hypothesis was that calcified callus microstructure and composition are impaired with advanced age and that older individuals show a reduced response to varying fixation stability. Semirigid fixation resulted in a larger ΔCSA (callus cross-sectional area) compared to the rigid groups. In vitro μCT analysis at 6 weeks postmortem showed callus bridging scores in younger animals to be superior to those of their older counterparts (p<0.01). Younger animals showed (i) larger callus strut thickness (p<0.001), (ii) lower perforation in struts (p<0.01), and (iii) higher mineralization of callus struts (p<0.001). Callus mineralization was reduced in young animals with semirigid fracture fixation but remained unaffected in the aged group. While stability influenced callus size and geometry, age did not. With no differences observed in relative osteoid areas in the callus ROI, both old and semirigidly fixated animals showed a higher osteoclast count (p<0.05). Blood vessel density was reduced in animals with semirigid fixation (p<0.05). In conclusion, in vivo monitoring indicated delayed callus maturation in aged individuals. Callus bridging and callus competence (microstructure and mineralization) were impaired in individuals of advanced age.
This matched the increased bone resorption due to higher osteoclast numbers. Varying the fixator configuration in older individuals did not alter the dominant effect of advanced age on callus tissue mineralization, unlike in their younger counterparts; age-associated influences appeared independent of stability. This study illustrates the dominating role of osteoclastic activity in age-related impaired healing, while demonstrating that optimizing fixation parameters such as stiffness appears less effective in influencing healing in aged individuals.


Statistical modeling of traffic crashes has been of interest to researchers for decades. Over the most recent decade, many crash models have accounted for extra-variation in crash counts—variation over and above that accounted for by the Poisson density. The extra-variation – or dispersion – is theorized to capture unaccounted-for variation in crashes across sites. The majority of studies have assumed fixed dispersion parameters in over-dispersed crash models—tantamount to assuming that unaccounted-for variation is proportional to the expected crash count. Miaou and Lord [Miaou, S.P., Lord, D., 2003. Modeling traffic crash-flow relationships for intersections: dispersion parameter, functional form, and Bayes versus empirical Bayes methods. Transport. Res. Rec. 1840, 31–40] challenged the fixed dispersion parameter assumption and examined various dispersion parameter relationships when modeling urban signalized intersection accidents in Toronto. They suggested that further work is needed to determine the appropriateness of the findings for rural as well as other intersection types, to corroborate their findings, and to explore alternative dispersion functions. This study builds upon the work of Miaou and Lord, exploring additional dispersion functions on an independent data set, and presents an opportunity to corroborate their findings. Data from Georgia are used in this study. A Bayesian modeling approach with non-informative priors is adopted, using sampling-based estimation via Markov chain Monte Carlo (MCMC) and the Gibbs sampler. A total of eight model specifications were developed; four employed traffic flows as explanatory factors in the mean structure, while the remainder included geometric factors in addition to major- and minor-road traffic flows. The models were compared and contrasted using the significance of coefficients, standard deviance, chi-square goodness-of-fit, and deviance information criterion (DIC) statistics.
The findings indicate that the modeling of the dispersion parameter, which essentially explains the extra-variance structure, depends greatly on how the mean structure is modeled. In the presence of a well-defined mean function, the extra-variance structure generally becomes insignificant, i.e., the variance structure is a simple function of the mean. It appears that extra-variation is a function of covariates when the mean structure (expected crash count) is poorly specified and suffers from omitted variables. In contrast, when sufficient explanatory variables are used to model the mean (expected crash count), extra-Poisson variation is not significantly related to these variables. If these results are generalizable, they suggest that model specification may be improved by testing extra-variation functions for significance. They also suggest that the known influences on expected crash counts are likely to differ from the factors that might help to explain unaccounted-for variation in crashes across sites.
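The extra-Poisson variation these models target can be reproduced with a short simulation; all parameters below are hypothetical, chosen only to exhibit a Poisson-gamma (negative binomial) variance structure.

```python
# Simulation sketch (hypothetical parameters, not the Georgia data): crash
# counts drawn from a Poisson-gamma mixture show variance above the mean,
# which is exactly the over-dispersion that the dispersion parameter captures.
import numpy as np

rng = np.random.default_rng(0)
n = 20000
flow = rng.uniform(0.5, 2.0, n)           # hypothetical traffic-flow covariate
mu = np.exp(0.5 + 1.0 * np.log(flow))     # expected crash count per site
alpha = 0.8                               # fixed dispersion parameter

# Poisson-gamma mixture: lambda_i = mu_i * Gamma(shape=1/alpha, scale=alpha),
# so that E[lambda_i] = mu_i and Var[crashes] = mu + alpha * mu^2 > mu (NB2).
lam = mu * rng.gamma(shape=1.0 / alpha, scale=alpha, size=n)
crashes = rng.poisson(lam)

print(f"mean = {crashes.mean():.2f}, variance = {crashes.var():.2f}")
```

A fixed `alpha` makes the unexplained variance a fixed function of the mean; the dispersion functions explored in the study amount to letting `alpha` itself depend on covariates.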


There has been considerable research conducted over the last 20 years focused on predicting motor vehicle crashes on transportation facilities. The range of statistical models commonly applied includes binomial, Poisson, Poisson-gamma (or negative binomial), zero-inflated Poisson and negative binomial models (ZIP and ZINB), and multinomial probability models. Given the range of possible modeling approaches and the host of assumptions with each modeling approach, making an intelligent choice for modeling motor vehicle crash data is difficult. There is little discussion in the literature comparing different statistical modeling approaches, identifying which statistical models are most appropriate for modeling crash data, and providing a strong justification from basic crash principles. In the recent literature, it has been suggested that the motor vehicle crash process can successfully be modeled by assuming a dual-state data-generating process, which implies that entities (e.g., intersections, road segments, pedestrian crossings, etc.) exist in one of two states—perfectly safe and unsafe. As a result, the ZIP and ZINB are two models that have been applied to account for the preponderance of “excess” zeros frequently observed in crash count data. The objective of this study is to provide defensible guidance on how to appropriately model crash data. We first examine the motor vehicle crash process using theoretical principles and a basic understanding of the crash process. It is shown that the fundamental crash process follows Bernoulli trials with unequal probabilities of independent events, also known as Poisson trials. We examine the evolution of statistical models as they apply to the motor vehicle crash process, and indicate how well they statistically approximate the crash process. We also present the theory behind dual-state process count models, and note why they have become popular for modeling crash data.
A simulation experiment is then conducted to demonstrate how crash data give rise to the “excess” zeros frequently observed in crash data. It is shown that the Poisson and other mixed probabilistic structures are approximations assumed for modeling the motor vehicle crash process. Furthermore, it is demonstrated that under certain (fairly common) circumstances excess zeros are observed—and that these circumstances arise from low exposure and/or inappropriate selection of time/space scales, not from an underlying dual-state process. In conclusion, carefully selecting the time/space scales for analysis, including an improved set of explanatory variables and/or unobserved heterogeneity effects in count regression models, or applying small-area statistical methods (for observations with low exposure) represent the most defensible modeling approaches for datasets with a preponderance of zeros.
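The low-exposure argument can be reproduced in a few lines; the crash rate and time scales below are hypothetical, not the study's simulation design.

```python
# Simulation sketch of "excess zeros from low exposure" (hypothetical
# numbers): a plain single-state Poisson process, observed over a short
# time scale, already produces an overwhelming fraction of zero counts --
# no dual-state (safe/unsafe) process is needed to explain them.
import numpy as np

rng = np.random.default_rng(42)
sites = 10000
true_rate = 0.05  # hypothetical expected crashes per site per week

for weeks in (1, 52):
    counts = rng.poisson(true_rate * weeks, size=sites)
    zero_frac = np.mean(counts == 0)
    print(f"{weeks:>2} week(s) of exposure: {zero_frac:.1%} zeros")
```

At one week of exposure, roughly e^(-0.05) ≈ 95% of sites record zero crashes purely by chance; aggregating to a year drops that sharply, which is why the choice of time/space scale, rather than a dual-state process, can account for the zeros.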


Now in its second edition, this book describes tools that are commonly used in transportation data analysis. The first part of the text provides statistical fundamentals while the second part presents continuous dependent variable models. With a focus on count and discrete dependent variable models, the third part features new chapters on mixed logit models, logistic regression, and ordered probability models. The last section provides additional coverage of Bayesian statistical modeling, including Bayesian inference and Markov chain Monte Carlo methods. Data sets are available online to use with the modeling techniques discussed.


Large trucks are involved in a disproportionately small fraction of the total crashes but a disproportionately large fraction of fatal crashes. Large truck crashes often result in significant congestion due to their large physical dimensions and from difficulties in clearing crash scenes. Consequently, preventing large truck crashes is critical to improving highway safety and operations. This study identifies high risk sites (hot spots) for large truck crashes in Arizona and examines potential risk factors related to the design and operation of the high risk sites. High risk sites were identified using both state of the practice methods (accident reduction potential using negative binomial regression with long crash histories) and a newly proposed method using Property Damage Only Equivalents (PDOE). The hot spots identified via the count model generally exhibited low fatalities and major injuries but large minor injuries and PDOs, while the opposite trend was observed using the PDOE methodology. The hot spots based on the count model exhibited large AADTs, whereas those based on the PDOE showed relatively small AADTs but large fractions of trucks and high posted speed limits. Documented site investigations of hot spots revealed numerous potential risk factors, including weaving activities near freeway junctions and ramps, absence of acceleration lanes near on-ramps, small shoulders to accommodate large trucks, narrow lane widths, inadequate signage, and poor lighting conditions within a tunnel.
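The PDOE idea can be sketched as a severity-weighted score per site; the weights and site records below are hypothetical illustrations, not the study's values.

```python
# Minimal sketch of severity-weighted hot-spot ranking (hypothetical weights
# and sites, not the study's data): each site's crash history is collapsed
# into Property Damage Only equivalents, then sites are ranked by that score.

# Hypothetical PDO-equivalent weights per severity class.
PDOE_WEIGHTS = {"fatal": 100.0, "major_injury": 30.0,
                "minor_injury": 5.0, "pdo": 1.0}

def pdoe_score(crash_counts):
    """crash_counts: dict of severity class -> number of crashes at a site."""
    return sum(PDOE_WEIGHTS[sev] * n for sev, n in crash_counts.items())

# Two hypothetical sites: one rare-but-severe, one frequent-but-minor.
sites = {
    "site_A": {"fatal": 1, "major_injury": 0, "minor_injury": 2, "pdo": 10},
    "site_B": {"fatal": 0, "major_injury": 1, "minor_injury": 8, "pdo": 40},
}
ranked = sorted(sites, key=lambda s: pdoe_score(sites[s]), reverse=True)
print(ranked)  # severity weighting puts the fatal-crash site first
```

This illustrates the trade-off reported above: a pure count model favors high-frequency, low-severity sites (large AADT), while the PDOE score elevates lower-volume sites with severe crashes.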


Background: Acute coronary syndromes are a major cause of mortality and morbidity. Objectives/Methods: The objective of this evaluation is to review the clinical trials of two new drugs being developed for the treatment of acute coronary syndromes. The first is the anticoagulant otamixaban; its trial compared otamixaban with unfractionated heparin plus eptifibatide in acute coronary syndromes. The second is the antiplatelet agent ticagrelor; its trial compared ticagrelor with clopidogrel in acute coronary syndromes. Results: In the SEPIA-ACS1 TIMI 42 trial, the primary efficacy endpoint occurred in 6.2% of subjects treated with unfractionated heparin plus eptifibatide, and significantly less often with otamixaban. In the PLATO trial, the primary efficacy endpoint had occurred less frequently in the ticagrelor group (9.8%) than in the clopidogrel group (11.7%) at 12 months. Conclusions: Two new drugs for acute coronary syndromes, otamixaban and ticagrelor, have recently been shown to have benefits, compared with the present standard regimens, in subjects undergoing percutaneous interventions.
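The PLATO figures quoted above (9.8% vs. 11.7% at 12 months) imply the standard derived risk measures; this is a short worked computation of those definitions, not trial output.

```python
# Worked arithmetic from the event rates quoted above: absolute risk
# reduction (ARR), relative risk reduction (RRR), and the implied number
# needed to treat (NNT).  Standard epidemiological definitions only.
ticagrelor, clopidogrel = 0.098, 0.117  # 12-month primary endpoint rates

arr = clopidogrel - ticagrelor  # absolute risk reduction
rrr = arr / clopidogrel         # relative risk reduction
nnt = 1.0 / arr                 # number needed to treat to prevent one event

print(f"ARR = {arr:.1%}, RRR = {rrr:.1%}, NNT ~ {round(nnt)}")
```

In words: treating roughly 53 patients with ticagrelor instead of clopidogrel for 12 months would be expected to prevent one primary-endpoint event, given the quoted rates.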