196 results for leukocyte count
Abstract:
Computer resource allocation is a significant challenge, particularly for multiprocessor systems, in which shared computing resources must be allocated among co-runner processes and threads. While efficient resource allocation yields a stable overall multiprocessor system and good individual thread performance, poor resource allocation causes significant performance bottlenecks even on systems with ample computing resources. This thesis proposes a cache-aware adaptive closed-loop scheduling framework as an efficient resource allocation strategy for this highly dynamic resource management problem, which requires instant estimation of highly uncertain and unpredictable resource patterns. Many different approaches to this problem have been developed, but neither the dynamic nature nor the time-varying, uncertain characteristics of the resource allocation problem are well considered. These approaches employ either static or dynamic optimization methods, or advanced scheduling algorithms such as the Proportional Fair (PFair) scheduling algorithm. Some approaches that do consider the dynamic nature of multiprocessor systems apply only a basic closed-loop system and hence fail to take the time-varying behaviour and uncertainty of the system into account. Therefore, further research into multiprocessor resource allocation is required. Our closed-loop cache-aware adaptive scheduling framework takes resource availability and resource usage patterns into account by measuring time-varying factors such as cache miss counts, stalls and instruction counts. More specifically, the cache usage pattern of each thread is identified using the QR recursive least squares (RLS) algorithm and cache miss count time-series statistics. For the identified cache resource dynamics, the framework enforces instruction fairness for the threads. Fairness, in the context of our research, is defined as resource allocation equity, which reduces co-runner thread dependence in a shared resource environment. In this way, instruction count degradation due to shared cache resource conflicts is overcome. The framework contributes to the research field in two major and three minor aspects. The two major contributions lead to the cache-aware scheduling system. The first major contribution is the development of the execution fairness algorithm, which mitigates the co-runner cache impact on thread performance. The second is the development of the relevant mathematical models, such as the thread execution pattern and cache access pattern models, which formulate the execution fairness algorithm in terms of mathematical quantities. Following the development of the cache-aware scheduling system, our adaptive self-tuning control framework is constructed to add an adaptive closed-loop aspect to it. This control framework consists of two main components: the parameter estimator and the controller design module. The first minor contribution is the development of the parameter estimator; the QR recursive least squares (RLS) algorithm is applied in our closed-loop cache-aware adaptive scheduling framework to estimate the highly uncertain and time-varying cache resource patterns of threads.
The second minor contribution is the design of the controller design module; the algebraic pole placement controller design algorithm is used to design a controller able to provide the desired time-varying control action. The adaptive self-tuning control framework and the cache-aware scheduling system together constitute our final framework, the closed-loop cache-aware adaptive scheduling framework. The third minor contribution is the validation of this framework's effectiveness in overcoming co-runner cache dependency. Time-series statistical counters are developed for the M-Sim multi-core simulator, and the theoretical findings and mathematical formulations are implemented as MATLAB m-file code. In this way, the overall framework is tested and the experimental outcomes are analyzed. The experiments indicate that our closed-loop cache-aware adaptive scheduling framework successfully drives the co-runner-cache-dependent thread instruction count towards the co-runner-independent instruction count, with an error margin of up to 25% when the cache is highly utilized. In addition, the thread cache access pattern is estimated with 75% accuracy.
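For illustration, the sketch below shows a generic recursive least squares estimator with a forgetting factor, the family of estimator the framework relies on for tracking time-varying cache behaviour; the AR(1)-style regressor, variable names and parameter values are assumptions for the example, not the thesis's QR-factorized formulation.

```python
import numpy as np

def rls_update(theta, P, phi, y, lam=0.98):
    """One recursive least squares (RLS) update with forgetting factor lam.

    theta : current parameter estimate, shape (n,)
    P     : current inverse-correlation matrix, shape (n, n)
    phi   : regressor vector for this sample, shape (n,)
    y     : observed output (e.g. cache miss count for this interval)
    """
    Pphi = P @ phi                      # shape (n,)
    k = Pphi / (lam + phi @ Pphi)       # gain vector
    err = y - phi @ theta               # prediction error (innovation)
    theta = theta + k * err             # parameter update
    P = (P - np.outer(k, Pphi)) / lam   # covariance update
    return theta, P

# Toy usage: track a 2-parameter AR(1)-style model of cache misses per interval.
rng = np.random.default_rng(0)
theta_true = np.array([0.7, 3.0])            # assumed "true" dynamics
theta, P = np.zeros(2), np.eye(2) * 1000.0   # vague initial estimate
prev_miss = 0.0
for _ in range(200):
    phi = np.array([prev_miss, 1.0])
    y = theta_true @ phi + rng.normal(scale=0.5)
    theta, P = rls_update(theta, P, phi, y)
    prev_miss = y
print(theta)  # converges towards theta_true
```

A forgetting factor below one discounts older samples, which is what lets the estimate follow cache dynamics as they drift over time.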
Abstract:
Detecting query reformulations within a session by a Web searcher is an important area of research for designing more helpful searching systems and targeting content to particular users. Methods explored by other researchers include both qualitative approaches (i.e., the use of human judges to manually analyze query patterns, usually on small samples) and nondeterministic algorithms, typically using large amounts of training data to predict query modification during sessions. In this article, we explore three alternative methods for detection of session boundaries. All three methods are computationally straightforward and therefore easily implemented for detection of session changes. We examine 2,465,145 interactions from 534,507 users of Dogpile.com on May 6, 2005. We compare session analysis using (a) Internet Protocol address and cookie; (b) Internet Protocol address, cookie, and a temporal limit on intrasession interactions; and (c) Internet Protocol address, cookie, and query reformulation patterns. Overall, our analysis shows that defining sessions by query reformulation along with Internet Protocol address and cookie provides the best measure, resulting in an 82% increase in the count of sessions. Regardless of the method used, the mean session length was fewer than three queries, and the mean session duration was less than 30 minutes. Searchers most often modified their query by changing query terms (nearly 23% of all query modifications) rather than adding or deleting terms. Implications are that for measuring searching traffic, unique sessions may be a better indicator than the common metric of unique visitors. This research also sheds light on the more complex aspects of Web searching involving query modifications and may lead to advances in searching tools.
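As an illustration of how computationally straightforward these heuristics are, here is a minimal sketch in the spirit of method (b); the record layout and the 30-minute cutoff are assumptions for the example, and method (c) would additionally inspect query reformulation patterns before starting a new session.

```python
from datetime import datetime, timedelta

TIMEOUT = timedelta(minutes=30)  # assumed intrasession time limit

def count_sessions(interactions):
    """Split a query log into sessions keyed by (IP, cookie), starting a
    new session whenever the gap since the last interaction exceeds TIMEOUT."""
    last_seen = {}   # (ip, cookie) -> timestamp of previous interaction
    sessions = 0
    for ip, cookie, ts, _query in sorted(interactions, key=lambda r: r[2]):
        key = (ip, cookie)
        if key not in last_seen or ts - last_seen[key] > TIMEOUT:
            sessions += 1
        last_seen[key] = ts
    return sessions

# Toy log: (ip, cookie, timestamp, query)
log = [
    ("1.2.3.4", "c1", datetime(2005, 5, 6, 9, 0), "jaguar"),
    ("1.2.3.4", "c1", datetime(2005, 5, 6, 9, 5), "jaguar car"),
    ("1.2.3.4", "c1", datetime(2005, 5, 6, 10, 0), "weather"),  # gap > 30 min
]
print(count_sessions(log))  # 2
```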
Abstract:
The case proposes an ethical dilemma that a Public Service Director faces that could affect his career, the career of his boss, and the career of the governor of a state. There is a strong need for ethical leaders in this changing global organizational world, where the headlines are filled with stories of private sector and public sector leaders who have made serious ethical and moral compromises. It is easy to follow ethical leaders whom you can count on to do what is right, and difficult to follow those who will do what is expedient or personally beneficial. However, ethical leadership is not always black and white, as this case portrays. Difficult decisions must be made where it may not always be clear what to do. The names in the case have been changed, although the situation is a real one.
Abstract:
The introduction of the Australian curriculum, the use of standardised testing (e.g. NAPLAN) and the My School website are couched in a context of accountability. This circumstance has stimulated, and in some cases renewed, a range of boundaries in Australian education. The consequences that arise from standardised testing have accentuated the boundaries produced by social reproduction in education, which has led to an increase in the number of students disengaging from mainstream education and applying for enrolment at the Edmund Rice Education Australia Flexible Learning Centre Network (EREAFLCN). Boundaries are created for many young people who are denied access to credentials and certification as a result of being excluded from, or in some way disengaging from, standardised education and testing. Young people who participate at the EREAFLCN arrive with a variety of forms of cultural capital that are not valued in current education and employment fields. This is not to say that these young people’s different forms of cultural capital have no value, but rather that such funds of knowledge, repertoires and cultural capital are not valued by the majority of powerful agents in educational and employment fields. How then can the qualitative value of traditionally unorthodox - yet often intricate, ingenious, and astute - versions of cultural capital evident in the habitus of many young people be made to count, be recognised, be valuated? Can a process of educational assessment be a field of capital exchange and a space which breaches boundaries through a valuating process? This paper reports on the development of an innovative approach to assessment in an alternative education institution designed for the re-engagement of ‘at risk’ youth who have left formal schooling. A case study approach has been used to document the engagement of six young people with an educational approach described as assessment for learning as a field of exchange, across two sites in the EREAFLCN. In order to capture the broad range of students’ cultural and social capital, an electronic portfolio system (EPS) is under trial. The model draws on categories from sociological models of capital and reconceptualises the eportfolio as a sociocultural zone of learning and development. Results from the trial show a general tendency towards engagement with the EPS and potential for the attainment of socially valued cultural capital in the form of school credentials. In this way restrictive boundaries can be breached and a more equitable outcome achieved for many young Australians.
Abstract:
The introduction of the Australian curriculum, the use of standardised testing (e.g. NAPLAN) and the My School website have stimulated, and in some cases renewed, a range of boundaries for young people in Australian education. Standardised testing has accentuated social reproduction in education, with an increase in the number of students disengaging from mainstream education and applying for enrolment at the Edmund Rice Education Australia Flexible Learning Centre Network (EREAFLCN). Many young people are denied access to credentials and certification as they become excluded from standardised education and testing. The creativity and skills of marginalised youth are often evidence of general capabilities, and yet do not appear to be recognised in mainstream educational institutions when standardised approaches are adopted. Young people who participate at the EREAFLCN arrive with a variety of forms of cultural capital, frequently utilising general capabilities, which are not able to be valued in current education and employment fields. This is not to say that these young people’s different forms of cultural capital have no value, but rather that such funds of knowledge, repertoires and cultural capital are not valued by the majority of powerful agents in educational and employment fields. How then can the inherent value of traditionally unorthodox - yet often intricate, ingenious, and astute - versions of cultural capital evident in the habitus of many young people be made to count, be recognised, be valuated? Can a process of educational assessment be a field of capital exchange and a space which crosses boundaries through a valuating process? This paper reports on the development of an innovative approach to assessment in an alternative education institution designed for the re-engagement of ‘at risk’ youth who have left formal schooling. A case study approach has been used to document the engagement of six young people with an educational approach described as assessment for learning as a field of exchange, across two sites in the EREAFLCN. In order to capture the broad range of students’ cultural and social capital, an electronic portfolio system (EPS) is under trial. The model draws on categories from sociological models of capital and reconceptualises the eportfolio as a sociocultural zone of learning and development. Results from the trial show a general tendency towards engagement with the EPS and potential for the attainment of socially valued cultural capital in the form of school credentials. In this way restrictive boundaries can be breached and a more equitable outcome achieved for many young Australians.
Abstract:
Standardised testing does not recognise the creativity and skills of marginalised youth. Young people who come to the Edmund Rice Education Australia Flexible Learning Centre Network (EREAFLCN) in Australia arrive with forms of cultural capital that are not valued in the field of education and employment. This is not to say that young people’s different modes of cultural capital have no value, but rather that such funds of knowledge, repertoires and cultural capital are not valued by the powerful agents in educational and employment fields. The forms of cultural capital which are valued by these institutions are measurable in certain structured formats which are largely inaccessible for what is seen in Australia to be a growing segment of the community. How then can the inherent value of traditionally unorthodox - yet often intricate, adroit, ingenious, and astute - versions of cultural capital evident in the habitus of many young people be made to count, be recognised, be valuated? Can a process of educational assessment be used as a marketplace, a field of capital exchange? This paper reports on the development of an innovative approach to assessment in an alternative education institution designed for the re-engagement of ‘at risk’ youth who have left formal schooling. In order to capture the broad range of students’ cultural and social capital, an electronic portfolio system (EPS) is under trial. The model draws on categories from sociological models of capital and reconceptualises the eportfolio as a sociocultural zone of learning and development. Initial results from the trial show a general tendency towards engagement with the EPS and potential for the attainment of socially valued cultural capital in the form of school credentials.
Abstract:
Background Although risk of human papillomavirus (HPV)–associated cancers of the anus, cervix, oropharynx, penis, vagina, and vulva is increased among persons with AIDS, the etiologic role of immunosuppression is unclear and incidence trends for these cancers over time, particularly after the introduction of highly active antiretroviral therapy in 1996, are not well described. Methods Data on 499 230 individuals diagnosed with AIDS from January 1, 1980, through December 31, 2004, were linked with cancer registries in 15 US regions. Risk of in situ and invasive HPV-associated cancers, compared with that in the general population, was measured by use of standardized incidence ratios (SIRs) and 95% confidence intervals (CIs). We evaluated the relationship of immunosuppression with incidence during the period of 4–60 months after AIDS onset by use of CD4 T-cell counts measured at AIDS onset. Incidence during the 4–60 months after AIDS onset was compared across three periods (1980–1989, 1990–1995, and 1996–2004). All statistical tests were two-sided. Results Among persons with AIDS, we observed statistically significantly elevated risk of all HPV-associated in situ (SIRs ranged from 8.9, 95% CI = 8.0 to 9.9, for cervical cancer to 68.6, 95% CI = 59.7 to 78.4, for anal cancer among men) and invasive (SIRs ranged from 1.6, 95% CI = 1.2 to 2.1, for oropharyngeal cancer to 34.6, 95% CI = 30.8 to 38.8, for anal cancer among men) cancers. During 1996–2004, low CD4 T-cell count was associated with statistically significantly increased risk of invasive anal cancer among men (relative risk [RR] per decline of 100 CD4 T cells per cubic millimeter = 1.34, 95% CI = 1.08 to 1.66, P = .006) and non–statistically significantly increased risk of in situ vagina or vulva cancer (RR = 1.52, 95% CI = 0.99 to 2.35, P = .055) and of invasive cervical cancer (RR = 1.32, 95% CI = 0.96 to 1.80, P = .077). Among men, incidence (per 100 000 person-years) of in situ and invasive anal cancer was statistically significantly higher during 1996–2004 than during 1990–1995 (61% increase for in situ cancers, 18.3 cases vs 29.5 cases, respectively; RR = 1.71, 95% CI = 1.24 to 2.35, P < .001; and 104% increase for invasive cancers, 20.7 cases vs 42.3 cases, respectively; RR = 2.03, 95% CI = 1.54 to 2.68, P < .001). Incidence of other cancers was stable over time. Conclusions Risk of HPV-associated cancers was elevated among persons with AIDS and increased with increasing immunosuppression. The increasing incidence for anal cancer during 1996–2004 indicates that prolonged survival may be associated with increased risk of certain HPV-associated cancers.
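For reference, a standardized incidence ratio is the observed case count divided by the count expected from general-population rates; a minimal sketch with exact Poisson confidence limits follows, where the counts are invented for illustration and are not the study's data.

```python
from scipy.stats import chi2

def sir_with_ci(observed, expected, alpha=0.05):
    """Standardized incidence ratio O/E with exact Poisson confidence limits."""
    sir = observed / expected
    lower = chi2.ppf(alpha / 2, 2 * observed) / (2 * expected) if observed > 0 else 0.0
    upper = chi2.ppf(1 - alpha / 2, 2 * (observed + 1)) / (2 * expected)
    return sir, lower, upper

# Hypothetical example: 120 cases observed where 3.5 were expected
print(sir_with_ci(120, 3.5))
```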
Abstract:
Introduction—Human herpesvirus 8 (HHV8) is necessary for Kaposi sarcoma (KS) to develop, but whether peripheral blood viral load is a marker of KS burden (total number of KS lesions), KS progression (the rate of eruption of new KS lesions), or both is unclear. We investigated these relationships in persons with AIDS. Methods—Newly diagnosed patients with AIDS-related KS attending Mulago Hospital, in Kampala, Uganda, were assessed for KS burden and progression by questionnaire and medical examination. Venous blood samples were taken for HHV8 load measurement by PCR. Associations were examined with odds ratios (OR) and 95% confidence intervals (CI) from logistic regression models and with t-tests. Results—Among 74 patients (59% men), the median age was 34.5 years (interquartile range [IQR], 28.5-41). HHV8 DNA was detected in 93% and quantified in 77% of patients. Median viral load was 3.8 log₁₀ per 10⁶ peripheral blood cells (IQR 3.4-5.0) and was higher in men than in women (4.4 vs. 3.8 logs; p=0.04), higher in patients with a faster (>20 lesions per year) than a slower rate of KS lesion eruption (4.5 vs. 3.6 logs; p<0.001), and higher, but not significantly so, among patients with more (>median [20] KS lesions) than fewer KS lesions (4.4 vs. 4.0 logs; p=0.16). HHV8 load was unrelated to CD4 lymphocyte count (p=0.23). Conclusions—We show a significant association of HHV8 load in peripheral blood with the rate of eruption of KS lesions, but not with total lesion count. Our results suggest that viral load increases concurrently with the development of new KS lesions.
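As a sketch of the type of association analysis described, the example below fits a logistic regression of fast lesion eruption on log viral load and reports the odds ratio with its 95% CI; the data and variable names are invented for illustration and are not the study's.

```python
import numpy as np
import statsmodels.api as sm

# Invented data: log10 HHV8 load and a binary indicator of fast lesion
# eruption (>20 new lesions/year) for ten hypothetical patients.
log_load = np.array([3.4, 3.6, 3.9, 4.2, 4.5, 4.8, 5.0, 3.5, 4.6, 4.9])
fast     = np.array([0,   0,   1,   0,   1,   1,   1,   0,   0,   1])

X = sm.add_constant(log_load)             # intercept + predictor
fit = sm.Logit(fast, X).fit(disp=False)   # logistic regression

or_per_log = np.exp(fit.params[1])        # odds ratio per 1-log increase in load
ci_low, ci_high = np.exp(fit.conf_int()[1])  # 95% CI on the OR scale
print(or_per_log, (ci_low, ci_high))
```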
Abstract:
The Poisson distribution has often been used for count data such as accident counts. The negative binomial (NB) distribution has been adopted for count data to address the over-dispersion problem. However, Poisson and NB distributions are incapable of taking into account unobserved heterogeneities due to spatial and temporal effects in accident data. To overcome this problem, random effect models have been developed. Another challenge with existing traffic accident prediction models is the excess of zero accident observations in some accident data. Although the zero-inflated Poisson (ZIP) model is capable of handling the dual-state system in accident data with excess zero observations, it does not accommodate the within-location and between-location correlation heterogeneities that are the basic motivation for random effect models. This paper proposes an effective way of fitting a ZIP model with location-specific random effects, and Bayesian analysis is recommended for model calibration and assessment.
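In outline, and in our own notation rather than necessarily the paper's, a ZIP model with location-specific random effects can be written as

```latex
P(Y_{it}=0) = \pi_i + (1-\pi_i)\,e^{-\lambda_{it}}, \qquad
P(Y_{it}=y) = (1-\pi_i)\,\frac{e^{-\lambda_{it}}\,\lambda_{it}^{y}}{y!}, \quad y = 1,2,\ldots,
\qquad
\log \lambda_{it} = \mathbf{x}_{it}^{\top}\boldsymbol{\beta} + u_i, \quad u_i \sim N(0,\sigma_u^2),
```

where the location-specific effect u_i captures within-location correlation; in a Bayesian calibration, priors are placed on the regression coefficients, the random-effect variance and the zero-state probabilities.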
Abstract:
This study proposes a full Bayes (FB) hierarchical modeling approach to traffic crash hotspot identification. The FB approach is able to account for all uncertainties associated with crash risk and various risk factors by estimating a posterior distribution of site safety, on which various ranking criteria can be based. Moreover, through hierarchical model specification, the FB approach can flexibly take into account various heterogeneities of crash occurrence due to spatiotemporal effects on traffic safety. Using Singapore intersection crash data (1997-2006), an empirical evaluation was conducted to compare the proposed FB approach with state-of-the-art approaches. Results show that the Bayesian hierarchical models that accommodate site-specific effects and serial correlation have better goodness-of-fit than non-hierarchical models. Furthermore, all model-based approaches perform significantly better in safety ranking than the naive approach using raw crash counts. The FB hierarchical models were found to significantly outperform the standard empirical Bayes (EB) approach in correctly identifying hotspots.
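A representative hierarchical specification of this kind, sketched in our own notation (the paper's exact model may differ), is

```latex
y_{it} \sim \mathrm{Poisson}(\lambda_{it}), \qquad
\log \lambda_{it} = \beta_0 + \mathbf{x}_{it}^{\top}\boldsymbol{\beta} + s_i + e_{it},
\qquad
s_i \sim N(0, \sigma_s^2), \qquad
e_{it} = \rho\, e_{i,t-1} + \varepsilon_{it},
```

where s_i is the site-specific effect, e_{it} introduces serial correlation across years, and hotspots are then ranked by criteria computed from the posterior distribution of site safety (for example posterior means, ranks or exceedance probabilities).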
Abstract:
School reform is a matter of both redistributive social justice and recognitive social justice. Following Fraser (Justice interruptus: critical reflections on the “postsocialist” condition. Routledge, New York, 1997), we begin from a philosophical and political commitment to the more equitable redistribution of knowledge, credentials, competence, and capacity to children of low socioeconomic, cultural, and linguistic minority and Indigenous communities whose access, achievement, and participation historically have “lagged” behind system norms and benchmarks set by middle class and dominant culture communities. At the same time, we argue that the recognition of these students and their communities’ lifeworlds, knowledges, and experiences in the curriculum, in classroom teaching, and learning is both a means and an end: a means toward improved achievement measured conventionally and a goal for reform and alteration of mainstream curriculum knowledge and what is made to count in the school as valued cultural knowledge and practice. The work that we report here was based on an ongoing 4-year project in which a team of university teacher educators/researchers has partnered with school leadership and staff to build relationships within community. The purpose has been to study whether and how engagement with new digital arts and multimodal literacies could have effects on students’ “conventional” print literacy achievement and, secondly, to study whether and how the overall performance of a school could be improved through a focus on professional conversations and partnerships in curriculum and instruction – rather than the top-down implementation of a predetermined pedagogical scheme, package, or approach.
Abstract:
Particles emitted by vehicles are known to cause detrimental health effects, with their size and oxidative potential among the main factors responsible. Therefore, understanding the relationship between traffic composition and both the physical characteristics and oxidative potential of particles is critical. To contribute to the limited knowledge base in this area, we investigated this relationship in a 4.5 km road tunnel in Brisbane, Australia. On-road concentrations of ultrafine particles (<100 nm, UFPs), fine particles (PM2.5), CO, CO2 and particle-associated reactive oxygen species (ROS) were measured using vehicle-based mobile sampling. UFPs were measured using a condensation particle counter and PM2.5 with a DustTrak aerosol photometer. A new profluorescent nitroxide probe, BPEAnit, was used to determine ROS levels. Comparative measurements were also performed on an above-ground road to assess the role of emission dilution on the parameters measured. The profile of UFP and PM2.5 concentrations with distance through the tunnel was determined and showed relationships with both road gradient and tunnel ventilation. ROS levels in the tunnel were found to be high compared to an open road with similar traffic characteristics, which was attributed to the substantial difference in estimated emission dilution ratios on the two roadways. Principal component analysis (PCA) revealed that the levels of pollutants and ROS were generally better correlated with total traffic count than with traffic composition (i.e. diesel- and gasoline-powered vehicles). A possible reason for the lack of correlation with heavy-duty vehicles (HDV), which have previously been shown to be strongly associated with UFPs especially, was the low absolute number of HDVs encountered during the sampling. This may have made their contribution to in-tunnel pollution largely indistinguishable from that of the total vehicle volume. For ROS, the stronger association observed with HDV and gasoline vehicles when combined (total traffic count) than when considered individually may signal a role for the interaction of their emissions as a determinant of on-road ROS in this pilot study. If further validated, this should not be overlooked in studies of on- or near-road particle exposure and its potential health effects.
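For illustration only, a minimal sketch of the kind of PCA used to explore which pollutants co-vary with traffic counts; the column names and values are invented placeholders, not the study's measurements.

```python
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Invented per-trip averages of pollutants and traffic counts
df = pd.DataFrame({
    "ufp":   [4.1e4, 3.2e4, 5.5e4, 2.8e4, 4.9e4],  # particles cm^-3
    "pm25":  [55, 40, 70, 35, 62],                 # ug m^-3
    "ros":   [0.9, 0.7, 1.3, 0.6, 1.1],            # probe response
    "total": [1200, 950, 1500, 900, 1400],         # total traffic count
    "hdv":   [60, 45, 80, 40, 75],                 # heavy-duty vehicle count
})

X = StandardScaler().fit_transform(df)   # PCA on standardized variables
pca = PCA(n_components=2).fit(X)

loadings = pd.DataFrame(pca.components_.T,
                        index=df.columns, columns=["PC1", "PC2"])
print(loadings)                          # which variables load together
print(pca.explained_variance_ratio_)
```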
Abstract:
Trauma to the spinal cord creates an initial physical injury damaging neurons, glia, and blood vessels, which then induces a prolonged inflammatory response, leading to secondary degeneration of spinal cord tissue, and further loss of neurons and glia surrounding the initial site of injury. Angiogenesis is a critical step in tissue repair, but in the injured spinal cord angiogenesis fails; blood vessels formed initially later regress. Stabilizing the angiogenic response is therefore a potential target to improve recovery after spinal cord injury (SCI). Vascular endothelial growth factor (VEGF) can initiate angiogenesis, but cannot sustain blood vessel maturation. Platelet-derived growth factor (PDGF) can promote blood vessel stability and maturation. We therefore investigated a combined application of VEGF and PDGF as a treatment for traumatic spinal cord injury, with the aim of reducing secondary degeneration by promoting angiogenesis. Immediately after hemisection of the spinal cord in the rat, we delivered VEGF and PDGF to the injury site. One and three months later the size of the lesion was significantly smaller in the treated group compared to controls, and there was significantly reduced gliosis surrounding the lesion. There was no significant effect of the treatment on blood vessel density, although there was a significant reduction in the numbers of macrophages/microglia surrounding the lesion, and a shift in the distribution of morphological and immunological phenotypes of these inflammatory cells. VEGF and PDGF delivered singly exacerbated secondary degeneration, increasing the size of the lesion cavity. These results demonstrate a novel therapeutic intervention for SCI, and reveal an unanticipated synergy for these growth factors whereby they modulated inflammatory processes and created a microenvironment conducive to axon preservation/sprouting.
Abstract:
The overall aim of this project was to contribute to existing knowledge regarding methods for measuring the characteristics of airborne nanoparticles and controlling occupational exposure to them, and to gather data on nanoparticle emission and transport in various workplaces. The scope of this study involved investigating the characteristics and behaviour of particles arising from the operation of six nanotechnology processes, subdivided into nine processes for measurement purposes. It did not include toxicological evaluation of the aerosol and therefore no direct conclusion was made regarding the health effects of exposure to these particles. Our research included real-time measurement of sub- and supermicrometre particle number and mass concentrations, count median diameter, and alveolar deposited surface area using condensation particle counters, an optical particle counter, a DustTrak photometer, a scanning mobility particle sizer, and a nanoparticle surface area monitor, respectively. Off-line particle analysis included scanning and transmission electron microscopy, energy-dispersive X-ray spectrometry, and thermal-optical analysis of elemental carbon. Sources of fibrous and non-fibrous particles were included.
Abstract:
Potential adverse effects on children's health may result from exposure to airborne particles at school. To address this issue, measurements of particle number concentration, particle size distribution and black carbon (BC) concentration were performed in three school buildings in Cassino (Italy) and its suburbs, outside and inside the classrooms during normal occupancy and use. Additional time-resolved information was gathered on ventilation conditions and classroom activity, and traffic count data around the schools were obtained using a video camera. Across the three investigated school buildings, the outdoor and indoor particle number concentrations, monitored from 4 nm up to 3 µm, ranged from 2.8×10⁴ particles cm⁻³ to 4.7×10⁴ particles cm⁻³ and from 2.0×10⁴ particles cm⁻³ to 3.5×10⁴ particles cm⁻³, respectively. Total particle concentrations were usually higher outdoors than indoors, because no indoor sources were detected. The measured indoor-to-outdoor (I/O) ratio was less than 1 (varying in a relatively narrow range from 0.63 to 0.74); however, one school exhibited indoor concentrations higher than outdoor during the morning rush hours. The particle size distribution at the outdoor site showed high particle concentrations in different size ranges, varying during the day; two modes were found in relation to the start and end of the school day. BC concentrations were five times higher at the urban school compared with the suburban, and the suburban-to-urban differences were larger than the relative differences in ultrafine particle concentrations.
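For reference, the indoor-to-outdoor ratio reported above is defined, as is usual in indoor air quality studies, as

```latex
I/O = \frac{C_{\text{indoor}}}{C_{\text{outdoor}}},
```

with C taken here as the particle number concentration measured inside and outside the classrooms.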