Abstract:
This paper explores the possibility of connecting two Wind Turbine Generators (WTGs) to the grid using a single three-level inverter. In the proposed system, the rectified output of one WTG is connected across the upper dc-link capacitor of a standard diode-clamped three-level inverter, while the rectified output of the other WTG is connected across the lower capacitor. This combination has several advantages, such as direct connection to the grid, reduced parts count, improved reliability, and high power capacity. However, the major problem in the proposed system is the imminent imbalance of the dc-link voltages; under such conditions, conventional modulation methods fail to produce the desired voltage and current waveforms. A detailed analysis of this issue and a novel space vector modulation method to address it are presented in this paper. A power-sharing algorithm is also proposed to track the maximum power point of each WTG. Simulation results are presented to attest to the efficacy of the proposed system.
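The abstract names a power-sharing algorithm for tracking each WTG's maximum power point but does not detail it; the Python sketch below is only a generic perturb-and-observe MPPT step, with hypothetical variable names, to illustrate the kind of reference adjustment such a tracker performs.

```python
# Generic perturb-and-observe MPPT step (illustrative only; not the paper's
# power-sharing algorithm). All names here are hypothetical.
def perturb_and_observe(v_ref, v_meas, p_meas, state, step=0.5):
    """Adjust a dc-link capacitor voltage reference towards maximum power.

    state: dict holding the previous voltage and power measurements.
    """
    dP = p_meas - state["p_prev"]
    dV = v_meas - state["v_prev"]
    # If the last perturbation increased power, keep moving in that direction;
    # otherwise reverse it.
    v_ref = v_ref + step if dP * dV > 0 else v_ref - step
    state["p_prev"], state["v_prev"] = p_meas, v_meas
    return v_ref

# One tracker instance would run per WTG, each acting on its own capacitor voltage.
state = {"p_prev": 0.0, "v_prev": 0.0}
v_ref = 400.0  # hypothetical initial reference (volts)
```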
Abstract:
Lower airway inflammation is generally classified as eosinophilic or neutrophilic. In conditions where eosinophilic inflammation predominates, such as asthma in children, corticosteroids are usually beneficial. Traditionally, lower airway eosinophilia is measured using cellular counts (through bronchoalveolar lavage or induced sputum). Both methods have limited applicability in children. When instruments to measure fractional exhaled nitric oxide (FeNO) became available, they presented an attractive option, providing a non-invasive method of measuring eosinophilic inflammation suitable for children and adults. Not surprisingly, proposals have been made that FeNO measurement can be used clinically in many scenarios, including monitoring the response to anti-inflammatory medications, verifying adherence to treatment, and predicting upcoming asthma exacerbations. This thesis addresses the utility of FeNO levels in various scenarios, specifically in relation to asthma control and cough, a contentious aspect of the diagnosis of asthma. The thesis consists of a series of systematic reviews (related to the main question) and original studies in children. The over-arching aim of the thesis is to determine if FeNO is a clinically useful tool in the management of asthma and common asthma symptoms. The specific aims of the thesis were to: 1. Determine if children with asthma have more severe acute respiratory symptoms at presentation with an asthma exacerbation and at days 7, 10 and 14, using validated scales. We also examined if children with asthma were more likely to have a persistent cough on day 14 than children with protracted bronchitis and/or controls. 2. Evaluate the efficacy of tailoring asthma interventions based on sputum analysis, in comparison to clinical symptoms (with or without spirometry/peak flow), for asthma-related outcomes in children and adults. 3. Evaluate the efficacy of tailoring asthma interventions based on exhaled nitric oxide, in comparison to clinical symptoms (with or without spirometry/peak flow), for asthma-related outcomes in children and adults. 4. Determine if adjustment of asthma medications based on FeNO levels (compared to management based on clinical symptoms) reduces severe exacerbations in children with asthma. 5. Examine the relationship between FeNO and exercise-induced bronchoconstriction and cough in children. The aims above are addressed in respective chapters, and all but one have been published or submitted. A synopsis of the findings is: In study-1 (Aim 1), we found that children with protracted bronchitis had the most severe acute respiratory infection symptoms and a higher percentage of respiratory morbidity at day 14 in comparison to children with asthma and healthy controls. The systematic review of study-2 (Aim 2) included 246 randomised adult participants (no children), with 221 completing the trials. In the meta-analysis, a significant reduction in the number of participants who had one or more asthma exacerbations occurred when treatment was based on sputum eosinophils in comparison to clinical symptoms. In the systematic review of study-3 (Aim 3), we found no significant difference between the intervention group (treatment adjusted based on FeNO) and the control group (treatment adjusted based on clinical symptoms) for the primary outcome of asthma exacerbations or for the other outcomes (clinical symptoms, FeNO level and spirometry).
In a post-hoc analysis, a significant reduction in the mean final daily ICS dose per adult was found in the group where treatment was based on FeNO in comparison to clinical symptoms. In contrast, in the paediatric studies, there was a significant increase in ICS dose in the FeNO-strategy arm. Thus, controversy remains over the benefit, or otherwise, of utilising exhaled nitric oxide (FeNO) in routine clinical practice. FeNO levels are dependent on atopy, and none of the 7 published trials considered atopic status when medications were adjusted according to FeNO levels. In study-4 (Aim 4), 64 children with asthma were recruited. Their asthma medications were adjusted according to either FeNO levels or usual clinical care, utilising a management hierarchy that took atopy into account. It was concluded that tailoring of asthma medications in accordance with FeNO levels (compared to usual management), taking atopy status into account, reduced the number of children with severe exacerbations. However, the FeNO-based strategy resulted in higher daily ICS doses and had no benefit on asthma control. In study-5 (Aim 5), 33 children with cough and 17 controls were recruited. They were randomised to undertake either an exercise challenge or a dry powder mannitol challenge on day 1 (with the alternative challenge being done on day 2). In addition, a 24-hour cough meter, skin prick test, capsaicin cough sensitivity test and cough diary were undertaken. The change in cough frequency post exercise was significantly increased in the children with cough. FeNO decreased post exercise regardless of whether EIB was present. Limitations of the studies were addressed in the respective chapters. In summary, the studies from this thesis have provided new information on: • The severity of respiratory symptoms, which was increased in the early phase of an asthma exacerbation but not in the later recovery phase when compared with controls. • The utility of FeNO in the management of children with asthma. • The relationship of FeNO, cough and EIB in children. • Systematic reviews on the efficacy of tailoring asthma interventions based on eosinophilic inflammatory markers (sputum analysis and FeNO) in comparison to clinical symptoms.
Abstract:
Sparse optical flow algorithms, such as the Lucas-Kanade approach, provide more robustness to noise than dense optical flow algorithms and are the preferred approach in many scenarios. Sparse optical flow algorithms estimate the displacement for a selected number of pixels in the image. These pixels can be chosen randomly; however, pixels in regions with more variance among their neighbours produce more reliable displacement estimates, so the selected pixel locations should be chosen wisely. In this study, the suitability of Harris corners, Shi-Tomasi's “Good features to track”, SIFT and SURF interest point extractors, Canny edges, and random pixel selection for the purpose of frame-by-frame tracking using a pyramidal Lucas-Kanade algorithm is investigated. The evaluation considers the important factors of processing time, feature count, and feature trackability in indoor and outdoor scenarios, using ground vehicles and unmanned aerial vehicles, and for the purpose of visual odometry estimation.
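A minimal Python/OpenCV sketch of the pipeline evaluated here (an assumed setup, not the study's code): Shi-Tomasi “Good features to track” are selected in one frame and tracked into the next with the pyramidal Lucas-Kanade algorithm.

```python
# Sparse frame-by-frame tracking with Shi-Tomasi features and pyramidal
# Lucas-Kanade (a sketch of an assumed setup, using OpenCV).
import cv2
import numpy as np

def track_features(prev_gray, next_gray, max_corners=500):
    """prev_gray, next_gray: consecutive grayscale frames (uint8 arrays)."""
    # Select pixels in high-variance regions ("Good features to track").
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=max_corners,
                                  qualityLevel=0.01, minDistance=7)
    if pts is None:
        return np.empty((0, 2)), np.empty((0, 2))
    # Pyramidal Lucas-Kanade: estimate displacement only at the selected pixels.
    next_pts, status, err = cv2.calcOpticalFlowPyrLK(
        prev_gray, next_gray, pts, None, winSize=(21, 21), maxLevel=3)
    ok = status.ravel() == 1
    return pts[ok].reshape(-1, 2), next_pts[ok].reshape(-1, 2)

# The returned point pairs give per-feature displacements that can feed,
# for example, a visual odometry estimate.
```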
Abstract:
Twitter is a very popular social network website that allows users to publish short posts called tweets. Users on Twitter can follow other users, called followees, and a user sees the posts of his followees on his Twitter home page. As the number of followees grows, an information overload problem arises from the number of tweets available on the user's page. Like other social network websites, Twitter attempts to elevate the tweets the user is expected to be interested in, in order to increase overall user engagement; however, it still ranks tweets in chronological order. The tweet ranking problem has been addressed in much recent research. A sub-problem of this is ranking the tweets of a single followee. In this paper we represent tweets using several features and then propose a weighted version of the well-known Borda Count (BC) voting system to combine several ranked lists into one. A gradient descent method and a collaborative filtering method are employed to learn the optimal weights. We also employ the Baldwin voting system for blending features (or predictors). Finally, we use a greedy feature selection algorithm to select the combination of features that yields the best results.
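A minimal Python sketch of the central aggregation step, under assumed data structures (the paper's learning of the weights via gradient descent or collaborative filtering is not reproduced): each feature contributes a ranked list of tweets, and weighted Borda Count points are summed per tweet.

```python
# Weighted Borda Count aggregation of per-feature tweet rankings
# (a sketch under assumed data structures).
from collections import defaultdict

def weighted_borda(ranked_lists, weights):
    """ranked_lists: one ranked list of tweet ids per feature, best first.
    weights: one non-negative weight per list (e.g. learned by gradient descent)."""
    scores = defaultdict(float)
    for ranking, w in zip(ranked_lists, weights):
        n = len(ranking)
        for position, tweet_id in enumerate(ranking):
            # Standard Borda points: n-1 for the top item, 0 for the last.
            scores[tweet_id] += w * (n - 1 - position)
    return sorted(scores, key=scores.get, reverse=True)

# Example: two feature-based rankings blended with unequal weights.
print(weighted_borda([["t1", "t2", "t3"], ["t2", "t1", "t3"]], [0.7, 0.3]))
```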
Abstract:
A novel burn wound hydrogel dressing has previously been developed which is composed of 2-acrylamido-2-methylpropane sulfonic acid sodium salt with silver nanoparticles (silver AMPS). This study compared the cytotoxicity of this dressing to commercially available silver products: Acticoat™, PolyMem Silver® and Flamazine™ cream. Human keratinocytes (HaCaT and primary HEK) and normal human fibroblasts (NHF) were exposed to dressings incubated on Nunc™ polycarbonate inserts for 24, 48 and 72 h. Four different cytotoxicity assays were performed: Trypan Blue cell count, MTT, Celltiter-Blue™ and Toluidine Blue surface area assays. The results were expressed as relative cell viability compared to an untreated control. The cytotoxic effects of Acticoat™ and Flamazine™ cream were dependent on exposure time and cell type. After 24 h of exposure, Acticoat™ and Flamazine™ cream were toxic to all tested cell lines. Surprisingly, HaCaTs treated with Acticoat™ and Flamazine™ had an improved ability to survive at 48 and 72 h, while HEKs and NHFs showed no improvement in survival with any treatment. The novel silver hydrogel and PolyMem Silver® showed low cytotoxicity to all tested cell lines at every time interval, and these results support the possibility of using the novel silver hydrogel as a burn wound dressing. Researchers who rely on HaCaT cells as an accurate keratinocyte model should be aware that they can respond differently from primary skin cells.
Abstract:
Objectives: Commercial sex is licensed in Victoria, Australia, such that sex workers are required to have regular tests for sexually transmitted infections (STIs). However, the incidence and prevalence of STIs in sex workers are very low, especially since condom use at work is almost universal. We aimed to conduct a cost-effectiveness analysis of the financial cost of the testing policy versus the health benefits of averting the transmission of HIV, syphilis, chlamydia and gonorrhoea to clients. Methods: We developed a simple mathematical transmission model, informed by conservative parameter estimates from all available data, linked to a cost-effectiveness analysis. Results: We estimated that under current testing rates, it costs over $A90 000 in screening costs for every chlamydia infection averted ($A600 000 in screening costs for each quality-adjusted life year (QALY) saved) and over $A4 000 000 for every HIV infection averted ($A10 000 000 in screening costs for each QALY saved). At an assumed willingness to pay of $A50 000 per QALY gained, HIV testing should be conducted no more often than approximately every 40 weeks and chlamydia testing approximately once per year; in comparison, current requirements are testing every 12 weeks for HIV and every 4 weeks for chlamydia. Conclusions: Mandatory screening of female sex workers at current testing frequencies is not cost-effective for the prevention of disease in their male clients. The current testing rate required of sex workers in Victoria is excessive. Screening intervals for sex workers should be based on local STI epidemiology and not fixed by legislation.
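The reported figures reduce to simple ratios; the Python sketch below, with placeholder numbers rather than the paper's parameters, shows how cost per infection averted and cost per QALY gained are computed and compared against a willingness-to-pay threshold.

```python
# Cost-effectiveness ratios with placeholder inputs (not the paper's parameters).
def cost_effectiveness(total_screening_cost, infections_averted, qalys_per_infection):
    cost_per_infection = total_screening_cost / infections_averted
    cost_per_qaly = total_screening_cost / (infections_averted * qalys_per_infection)
    return cost_per_infection, cost_per_qaly

WILLINGNESS_TO_PAY = 50_000  # $A per QALY gained, as assumed in the analysis
cpi, cpq = cost_effectiveness(total_screening_cost=1_000_000,  # hypothetical
                              infections_averted=11,           # hypothetical
                              qalys_per_infection=0.15)        # hypothetical
print(f"per infection averted: ${cpi:,.0f}; per QALY: ${cpq:,.0f}; "
      f"cost-effective: {cpq <= WILLINGNESS_TO_PAY}")
```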
Abstract:
PURPOSE
The purposes of this study were to:
1) establish inter-instrument reliability between left and right hip accelerometer placement;
2) examine procedural reliability of a walking protocol used to measure physical activity (PA); and
3) confirm concurrent validity of accelerometers in measuring PA intensity as compared to the gold standard of oxygen consumption measured by indirect calorimetry.
METHODS
Eight children (mean age 11.9 years, SD 3.2; 75% male) with CP (GMFCS levels I-III) wore ActiGraph GT3X accelerometers on each hip and the Cosmed K4b
Abstract:
In the commercial food industry, demonstration of microbiological safety and thermal process equivalence often involves a mathematical framework that assumes log-linear inactivation kinetics and invokes concepts of decimal reduction time (DT), z values, and accumulated lethality. However, many microbes, particularly spores, exhibit inactivation kinetics that are not log linear. This has led to alternative modeling approaches, such as the biphasic and Weibull models, that relax strong log-linear assumptions. Using a statistical framework, we developed a novel log-quadratic model, which approximates the biphasic and Weibull models and provides additional physiological interpretability. As a statistical linear model, the log-quadratic model is relatively simple to fit and straightforwardly provides confidence intervals for its fitted values. It allows a DT-like value to be derived, even from data that exhibit obvious "tailing." We also showed how existing models of non-log-linear microbial inactivation, such as the Weibull model, can fit into a statistical linear model framework that dramatically simplifies their solution. We applied the log-quadratic model to thermal inactivation data for the spore-forming bacterium Clostridium botulinum and evaluated its merits compared with those of popular previously described approaches. The log-quadratic model was used as the basis of a secondary model that can capture the dependence of microbial inactivation kinetics on temperature. This model, in turn, was linked to models of spore inactivation of Sapru et al. and Rodriguez et al. that posit different physiological states for spores within a population. We believe that the log-quadratic model provides a useful framework in which to test vitalistic and mechanistic hypotheses of inactivation by thermal and other processes. Copyright © 2009, American Society for Microbiology. All Rights Reserved.
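A minimal Python sketch of the idea, assuming the parameterisation log10 N(t) = b0 + b1·t + b2·t², fitted as an ordinary linear model; the data and the DT-like calculation are illustrative only, not the authors' values.

```python
# Log-quadratic survival model fitted as a statistical linear model
# (illustrative data; assumed parameterisation log10 N = b0 + b1*t + b2*t^2).
import numpy as np

t = np.array([0.0, 1.0, 2.0, 4.0, 6.0, 8.0, 10.0])    # heating time (min), toy data
logN = np.array([7.0, 6.1, 5.4, 4.4, 3.8, 3.4, 3.2])  # log10 survivors, with "tailing"

# Ordinary least squares on the design matrix [1, t, t^2].
X = np.column_stack([np.ones_like(t), t, t**2])
b, *_ = np.linalg.lstsq(X, logN, rcond=None)

# A DT-like value: time for a one-log10 reduction relative to t = 0,
# i.e. the smallest positive root of b1*t + b2*t^2 = -1.
roots = np.roots([b[2], b[1], 1.0])
D_like = min(r.real for r in roots if abs(r.imag) < 1e-9 and r.real > 0)
print(f"coefficients: {b}, DT-like value: {D_like:.2f} min")
```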
Abstract:
Objectives: Directly measuring disease incidence in a population is difficult and not feasible to do routinely. We describe the development and application of a new method for estimating, at a population level, the number of incident genital chlamydia infections, and the corresponding incidence rates, by age and sex, using routine surveillance data. Methods: A Bayesian statistical approach was developed to calibrate the parameters of a decision-pathway tree against national data on numbers of notifications and tests conducted (2001-2013). Independent beta probability density functions were adopted as priors on the time-independent parameters; the shape parameters of these beta distributions were chosen to match prior estimates sourced from peer-reviewed literature or expert opinion. To best facilitate the calibration, multivariate Gaussian priors on (the logistic transforms of) the time-dependent parameters were adopted, using the Matérn covariance function to favour gradual changes over consecutive years and across adjacent age cohorts. The model outcomes were validated by comparing them with other independent empirical epidemiological measures, i.e. prevalence and incidence as reported by other studies. Results: Model-based estimates suggest that the total number of people acquiring chlamydia per year in Australia has increased by ~120% over 12 years. Nationally, an estimated 356,000 people acquired chlamydia in 2013, which is 4.3 times the number of reported diagnoses. This corresponds to an annual chlamydia incidence estimate of 1.54% in 2013, up from 0.81% in 2001 (a ~90% increase). Conclusions: We developed a statistical method which uses routine surveillance (notifications and testing) data to produce estimates of the extent of, and trends in, chlamydia incidence.
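A minimal Python sketch (illustrative only, not the authors' calibration code) of one ingredient of the prior: a multivariate Gaussian on a logit-transformed, time-dependent parameter with a Matérn-3/2 covariance over years, so that values in consecutive years are correlated.

```python
# A Gaussian prior with Matérn-3/2 covariance over consecutive years
# (illustrative only; not the authors' calibration code).
import numpy as np

def matern32(d, lengthscale=2.0, variance=1.0):
    a = np.sqrt(3.0) * d / lengthscale
    return variance * (1.0 + a) * np.exp(-a)

years = np.arange(2001, 2014)                      # 2001-2013
D = np.abs(years[:, None] - years[None, :])        # pairwise distances in years
K = matern32(D) + 1e-9 * np.eye(len(years))        # jitter for numerical stability

# One prior draw of a logit-scale parameter trajectory, back-transformed to (0, 1);
# nearby years are correlated, so the trajectory changes gradually.
rng = np.random.default_rng(0)
logit_theta = rng.multivariate_normal(np.zeros(len(years)), K)
theta = 1.0 / (1.0 + np.exp(-logit_theta))
```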
Abstract:
A hippocampal CA3 memory model was constructed with PGENESIS, a recently developed version of GENESIS that allows for distributed processing of a neural network simulation. A number of neural models of the human memory system have identified the CA3 region of the hippocampus as storing the declarative memory trace. However, computational models designed to assess the viability of the putative mechanisms of storage and retrieval have generally been too abstract to allow comparison with empirical data. Recent experimental evidence has shown that selective knock-out of NMDA receptors in the CA1 of mice leads to reduced stability of firing specificity in place cells. Here, a similar reduction in the stability of input specificity is demonstrated in a biologically plausible neural network model of the CA3 region, under conditions of Hebbian synaptic plasticity versus an absence of plasticity. The CA3 region is also commonly associated with seizure activity. Further simulations of the same model tested the response to continuously repeating versus randomized non-repeating input patterns. Each paradigm delivered input of equal intensity and duration. Non-repeating input patterns elicited a greater pyramidal cell spike count. This suggests that repetitive versus non-repeating neocortical input has a quantitatively different effect on the hippocampus, which may be relevant to the production of independent epileptogenic zones and the process of encoding new memories.
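As a conceptual illustration only (the actual model is a biophysical PGENESIS simulation), the Python sketch below shows the kind of Hebbian weight update contrasted with the no-plasticity condition: weights strengthen when pre- and postsynaptic activity coincide.

```python
# Toy Hebbian weight update (conceptual only; not the PGENESIS CA3 model).
import numpy as np

def hebbian_step(w, pre, post, lr=0.01, w_max=1.0):
    """pre, post: spike vectors (0/1) for one timestep; w: recurrent weight matrix."""
    w = w + lr * np.outer(post, pre)   # coincident pre/post activity potentiates
    return np.clip(w, 0.0, w_max)      # keep weights bounded

rng = np.random.default_rng(1)
w = rng.uniform(0.0, 0.5, size=(50, 50))        # toy "CA3" recurrent weights
for _ in range(100):                            # plasticity condition
    pre = (rng.random(50) < 0.1).astype(float)
    post = (rng.random(50) < 0.1).astype(float)
    w = hebbian_step(w, pre, post)
# The no-plasticity condition corresponds to leaving w unchanged across inputs.
```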
Abstract:
Magnetic resonance is a well-established tool for structural characterisation of porous media. Features of pore-space morphology can be inferred from NMR diffusion-diffraction plots or the time-dependence of the apparent diffusion coefficient. Diffusion NMR signal attenuation can be computed from the restricted diffusion propagator, which describes the distribution of diffusing particles for a given starting position and diffusion time. We present two techniques for efficient evaluation of restricted diffusion propagators for use in NMR porous-media characterisation. The first is the Lattice Path Count (LPC). Its physical essence is that the restricted diffusion propagator connecting points A and B in time t is proportional to the number of distinct length-t paths from A to B. By using a discrete lattice, the number of such paths can be counted exactly. The second technique is the Markov transition matrix (MTM). The matrix represents the probabilities of jumps between every pair of lattice nodes within a single timestep. The propagator for an arbitrary diffusion time can be calculated as the appropriate matrix power. For periodic geometries, the transition matrix needs to be defined only for a single unit cell. This makes MTM ideally suited for periodic systems. Both LPC and MTM are closely related to existing computational techniques: LPC, to combinatorial techniques; and MTM, to the Fokker-Planck master equation. The relationship between LPC, MTM and other computational techniques is briefly discussed in the paper. Both LPC and MTM perform favourably compared to Monte Carlo sampling, yielding highly accurate and almost noiseless restricted diffusion propagators. Initial tests indicate that their computational performance is comparable to that of finite element methods. Both LPC and MTM can be applied to complicated pore-space geometries with no analytic solution. We discuss the new methods in the context of diffusion propagator calculation in porous materials and model biological tissues.
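A minimal Python sketch of the MTM technique on an assumed one-dimensional lattice with reflecting walls (not the paper's geometry): the single-timestep transition matrix is raised to the n-th power to obtain the propagator for n timesteps.

```python
# Markov transition matrix (MTM) propagator on a 1-D lattice with reflecting
# walls (an assumed geometry for illustration).
import numpy as np

N = 50                                  # lattice nodes between the two walls
P = np.zeros((N, N))                    # P[i, j]: probability of jumping i -> j
for i in range(N):
    P[i, max(i - 1, 0)] += 0.5          # jump left (reflected at the wall)
    P[i, min(i + 1, N - 1)] += 0.5      # jump right (reflected at the wall)

n_steps = 200                           # diffusion time in units of the timestep
propagator = np.linalg.matrix_power(P, n_steps)

# Row i is the distribution of end positions for a particle starting at node i,
# i.e. the restricted diffusion propagator on this lattice.
start = N // 2
print(propagator[start].sum())          # probability is conserved (approximately 1.0)
```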
Abstract:
Due to their unobtrusive nature, vision-based approaches to tracking sports players have been preferred over wearable sensors, as they do not require the players to be instrumented for each match. Unfortunately, however, due to heavy occlusion between players, variation in resolution and pose, and fluctuating illumination conditions, tracking players continuously is still an unsolved vision problem. For tasks like clustering and retrieval, having noisy data (i.e. missing and false player detections) is problematic, as it generates discontinuities in the input data stream. One method of circumventing this issue is to use an occupancy map, where the field is discretised into a series of zones and a count of player detections in each zone is obtained. A series of frames can then be concatenated to represent a set-play or example of team behaviour. A problem with this approach, though, is that the compressibility is low (i.e. the variability in the feature space is incredibly high). In this paper, we propose the use of a bilinear spatiotemporal basis model using a role representation to clean up the noisy detections; the model operates in a low-dimensional space. To evaluate our approach, we used a fully instrumented field-hockey pitch with 8 fixed high-definition (HD) cameras, evaluated our approach on approximately 200,000 frames of data from a state-of-the-art real-time player detector, and compared the results to manually labeled data.
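A minimal Python sketch of the occupancy-map representation described above, with assumed field dimensions and grid resolution: each frame's player detections are binned into zones, and frames are stacked to describe a set-play.

```python
# Occupancy-map representation of player detections
# (field dimensions and grid resolution are assumptions for illustration).
import numpy as np

def occupancy_map(detections, field_size=(91.4, 55.0), grid=(10, 6)):
    """detections: (x, y) player positions in metres for a single frame."""
    counts = np.zeros(grid, dtype=int)
    for x, y in detections:
        gx = min(int(x / field_size[0] * grid[0]), grid[0] - 1)
        gy = min(int(y / field_size[1] * grid[1]), grid[1] - 1)
        counts[gx, gy] += 1
    return counts

# A set-play is a sequence of per-frame maps stacked along the time axis.
rng = np.random.default_rng(2)
frames = [rng.random((10, 2)) * [91.4, 55.0] for _ in range(5)]   # toy detections
sequence = np.stack([occupancy_map(f) for f in frames])           # shape (5, 10, 6)
```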
Abstract:
Those who teach film and media need to use screen content to illustrate their subjects. For example, students want illustrations to accompany lectures on film or television genres. Our experience has been that student access to the film and television screen content underpinning a study of genres is not only desirable but is, in fact, crucial for effective teaching and learning outcomes. Not so long ago, a screening during or at the completion of a lecture was the expected method by which educators delivered screen content to illustrate their teaching. Even if student attendance fluctuated from week to week, a quick head count confirmed that a certain number of students were physically present. It was assumed that this physical attendance encouraged students to reflect upon and contextualize the material post lecture. While simply attending a lecture will not translate into actual student learning, it does demonstrate a willingness by students to engage with the course content by making a commitment to attend a scheduled and recurring lecture and screening program. However, as flipped classroom models gain acceptance in educational institutions, this traditional lecture-screening model is giving way to online, off-site, and student-controlled mechanisms for screen content delivery and viewing. Nevertheless, care should be taken when assessing how online delivery translates into student engagement and learning. As Junco (2012) points out, “it’s not the technology that generates learning, but the ways in which the technology are used.” Discussed, debated, and embraced to varying degrees by educators, the flipped classroom still has no definitive model – although many models involve ‘flipping’ content and knowledge acquisition (including viewing films and television shows) from scheduled on-campus classes to online material viewed by students in advance of an on-campus lecture or class. The classroom or tutorial room then becomes a space to problem-solve, engage in collaborative learning, and advance and explain concepts. From an institutional perspective, the flipped classroom model could deliver an additional benefit beyond immediate pedagogical concerns. Tucker (2012) suggests that through the flipped classroom model “all aspects of instruction can be rethought to best maximize the scarcest learning resource — time.” The narrative most often associated with this shift is that the move to online delivery of lecture and cinematic/televisual material may also give educators more time for other work, such as engaging in research and planning strategies to empower students. Experimentation with the flipped classroom model is playing out in various educational institutions. Yet several core concerns remain — one of these is the crucial question of whether an online/digital flipped approach is more effective for student engagement and learning than the traditional lecture-screening mode of screen content delivery. Some urge caution in this regard, arguing that “new technology isn’t always supported by change management and professional development to ensure that digital isn’t just a goal within itself, but actually helps to transform education” (Fleming cited in Blain 2014). The most fundamental concern remains: how do lecturers, instructors, and tutors know that students have watched the films and television shows associated with a subject?
The remainder of this discussion deals with these concerns, and possible solutions offered, through an analysis of the Film, Television and Screen Genres subject at the Queensland University of Technology (QUT) in Brisbane, Queensland.
Abstract:
Although the external influence of scholars has usually been approximated by publication and citation counts, the array of scholarly activities is far more extensive. Today, new technologies, in particular Internet search engines, allow more accurate measurement of scholars' influence on societal discourse. Hence, in this article, we analyse the relation between the internal and external influence of 723 top economists, using the number of pages indexed by Google and Bing as a measure of external influence. We not only identify a small association between these scholars’ internal and external influence but also find a correlation between internal influence, as captured by receipt of major academic awards such as the Nobel Prize and the John Bates Clark Medal, and the external prominence of the top 100 researchers (JEL codes: A11, A13, Z18).