145 results for CENTERS


Relevance:

10.00%

Publisher:

Abstract:

The Midwest Independent Transmission System Operator (MISO) has experienced significant amounts of wind power development within the last decade. The MISO footprint spans the majority of the upper Midwest region of the country, from the Dakotas to Indiana and as far east as Michigan. These areas have a rich wind energy resource. States in the MISO footprint have passed laws or set goals that require load-serving entities to supply a portion of their load using renewable energy. To meet these requirements, significant investments are needed to build the transmission infrastructure necessary to deliver power from these often remote wind energy resources to the load centers. This paper presents some of the transmission planning work done at MISO that was largely influenced by current and future needs for increased wind power generation in the footprint. Specifically, the topics covered are generator interconnection, long-term planning coordination, and cost allocation for new transmission lines.

Relevance:

10.00%

Publisher:

Abstract:

The Midwestern US is rich in wind resources, and wind power is being developed in this region at a very brisk pace. Transporting this energy to load centers invariably requires massive transmission lines. Developing additional transmission to support reliable integration of wind onto the power grid poses a multitude of interesting challenges spanning various areas of power systems, such as transmission planning, real-time operations, and cost allocation for new transmission. The Midwest ISO, as a regional transmission provider, is responsible for processing requests to interconnect proposed generation onto the transmission grid under its purview. This paper provides information about some of the issues faced in performing interconnection planning studies and the Midwest ISO's efforts to improve its generator interconnection procedures. Related cost-allocation efforts currently ongoing at the Midwest ISO to streamline integration of bulk quantities of wind power into the transmission grid are also presented.

Relevance:

10.00%

Publisher:

Abstract:

Problem, research strategy, and findings: The privatization of airports in Australia included airport property development rights, regulated only by federal, not local, land use control. Airports then developed commercial and retail centers outside local community plans, resulting in a history of poor coordination of planning and reflecting strong differences between public and private views of the role of the airport. Private owners embraced the concept of an Airport City, envisioning the airport as a portal of global infrastructure, whereas public planning agencies struggled with infrastructure coordination and with real estate developed outside local planning regulations. Stakeholder workshops were conducted for each case, in which key stakeholders from airports, regulatory agencies, and state and local governments identified the key issues affecting planning in and around airports. This research demonstrates that if modes of infrastructure provision change significantly (such as through privatization of public services), that transformation is best accompanied by comprehensive changes in planning regimes to accommodate metropolitan and airport interdependencies. Privatization has exacerbated the historically poor coordination of planning, and a focus on coordination between public and private infrastructure planning is needed to overcome differences in values and interests. Takeaway for practice: Governance styles differ considerably between public agencies and private corporations. Planners should understand the drivers and value differences to better coordinate infrastructure delivery and effective planning. Research support: The Airport Metropolis Research Project under the Australian Research Council's Linkage Projects funding scheme (LP0775225).

Relevance:

10.00%

Publisher:

Abstract:

Rural, regional, and remote settlements in Australia require resilient infrastructure to remain sustainable in a context characterized by frequent large-scale natural disasters, long distances between urban centers, and the pressures of economic change. A critical aspect of this infrastructure is the air services network, a system of airports, aircraft operators, and related industries that enables the high-speed movement of people, goods, and services to remote locations. A process of deregulation during the 1970s and 1980s resulted in many of these airports passing into local government and private ownership, and the rationalization of the industry saw the closure of a number of airlines and airports. This paper examines the impacts of deregulation on the resilience of air services and the contribution that they make to regional and rural communities. In particular, the robustness, redundancy, resourcefulness, and rapidity of the system are examined. The conclusion is that while the air services network has remained resilient in a situation of considerable change, the pressures of commercialization and the tendency to manage aspects of the system in isolation have contributed to a potential decrease in overall resilience.

Relevance:

10.00%

Publisher:

Abstract:

Purpose: The purpose of this study was to investigate the nature and prevalence of discrimination against people living with HIV/AIDS in West Bengal, India, and how discrimination is associated with depression, suicidal ideation, and suicide attempts. Method: Semi-structured interviews and the Beck Depression Inventory were administered to 105 HIV-infected persons recruited by incidental sampling at an Integrated Counseling and Testing Center (ICTC) and through Networks of People Living with HIV/AIDS in the West Bengal area. Results: Findings showed that 40.8% of the sample had experienced discrimination in at least one social setting, such as the family (29.1%), health centers (18.4%), the community (17.5%), and the workplace (6.8%). About two-fifths (40.8%) reported experiencing discrimination in multiple social settings. Demographic factors associated with discrimination were gender, age, occupation, education, and current residence. More than half of the sample was suffering from severe depression, while 8.7% had attempted suicide. Discrimination in most settings was significantly associated with suicidal ideation and suicide attempts. Conclusions: The prevalence of discrimination associated with HIV/AIDS is high in our sample from West Bengal. While discrimination was not associated with depressive symptomatology, it was associated with suicidal ideation and attempts. These findings suggest an urgent need for interventions to reduce HIV/AIDS-related discrimination in the West Bengal region.

Relevance:

10.00%

Publisher:

Abstract:

Objective: To establish risk factors for moderate and severe microbial keratitis among daily contact lens (CL) wearers in Australia. Design: A prospective, 12-month, population-based, case-control study. Participants: New cases of moderate and severe microbial keratitis in daily wear CL users presenting in Australia over a 12-month period were identified through surveillance of all ophthalmic practitioners. Case detection was augmented by record audits at major ophthalmic centers. Controls were users of daily wear CLs in the community identified using a national telephone survey. Testing: Cases and controls were interviewed by telephone to determine subject demographics and CL wear history. Multiple binary logistic regression was used to determine independent risk factors and univariate population attributable risk percentage (PAR%) was estimated for each risk factor.; Main Outcome Measures: Independent risk factors, relative risk (with 95% confidence intervals [CIs]), and PAR%. Results: There were 90 eligible moderate and severe cases related to daily wear of CLs reported during the study period. We identified 1090 community controls using daily wear CLs. Independent risk factors for moderate and severe keratitis while adjusting for age, gender, and lens material type included poor storage case hygiene 6.4× (95% CI, 1.9-21.8; PAR, 49%), infrequent storage case replacement 5.4× (95% CI, 1.5-18.9; PAR, 27%), solution type 7.2× (95% CI, 2.3-22.5; PAR, 35%), occasional overnight lens use (<1 night per week) 6.5× (95% CI, 1.3-31.7; PAR, 23%), high socioeconomic status 4.1× (95% CI, 1.2-14.4; PAR, 31%), and smoking 3.7× (95% CI, 1.1-12.8; PAR, 31%). Conclusions: Moderate and severe microbial keratitis associated with daily use of CLs was independently associated with factors likely to cause contamination of CL storage cases (frequency of storage case replacement, hygiene, and solution type). 
Other factors included occasional overnight use of CLs, smoking, and socioeconomic class. Disease load may be considerably reduced by attention to modifiable risk factors related to CL storage case practice.
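
The univariate population attributable risk percentages (PAR%) quoted above are conventionally derived with Levin's formula from the exposure prevalence and the relative risk. A minimal sketch; the 20% prevalence used below is a hypothetical illustration, not a figure from the study:

```python
def par_percent(prevalence, rr):
    """Levin's population attributable risk percent: the share of
    disease in the population attributable to the exposure."""
    excess = prevalence * (rr - 1.0)
    return 100.0 * excess / (1.0 + excess)

# Hypothetical: if 20% of daily CL wearers had poor storage case
# hygiene (relative risk ~6.4x), Levin's formula gives:
print(round(par_percent(0.20, 6.4), 1))  # → 51.9
```

With the study's actual exposure prevalences, the same formula yields the reported figures, such as PAR 49% for poor case hygiene.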

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we propose an approach which attempts to solve the problem of surveillance event detection, assuming that we know the definition of the events. To facilitate the discussion, we first define two concepts: the event of interest refers to the event that the user requests the system to detect, and the background activities are any other events in the video corpus. This is an unsolved problem due to many factors, listed below: 1) Occlusions and clustering: Surveillance scenes of significant interest, at locations such as airports, railway stations, and shopping centers, are often crowded, and occlusions and clustering of people are frequently encountered. This significantly affects the feature extraction step; for instance, trajectories generated by object tracking algorithms are usually not robust in such situations. 2) The requirement for real-time detection: The system should process video fast enough, in both the feature extraction and the detection step, to facilitate real-time operation. 3) Massive size of the training data set: Suppose an event lasts for 1 minute in a video with a frame rate of 25 fps; the number of frames for this event is 60 × 25 = 1500. If we want a training data set with many positive instances of the event, the video is likely to be very large (i.e., hundreds of thousands of frames or more). Handling such a large data set is a problem frequently encountered in this application. 4) Difficulty in separating the event of interest from background activities: The events of interest often co-exist with a set of background activities. Temporal groundtruth is typically very ambiguous, as it does not distinguish the event of interest from a wide range of co-existing background activities; however, it is not practical to annotate the locations of the events in large amounts of video data. This problem becomes more serious in the detection of multi-agent interactions, since the location of these events often cannot be constrained to within a bounding box. 5) Challenges in determining the temporal boundaries of the events: An event can occur at any arbitrary time with an arbitrary duration. The temporal segmentation of events is difficult and ambiguous, and is also affected by other factors such as occlusions.

Relevance:

10.00%

Publisher:

Abstract:

In this paper we propose a framework for both gradient descent image and object alignment in the Fourier domain. Our method centers upon the classical Lucas & Kanade (LK) algorithm where we represent the source and template/model in the complex 2D Fourier domain rather than in the spatial 2D domain. We refer to our approach as the Fourier LK (FLK) algorithm. The FLK formulation is advantageous when one pre-processes the source image and template/model with a bank of filters (e.g. oriented edges, Gabor, etc.) as: (i) it can handle substantial illumination variations, (ii) the inefficient pre-processing filter bank step can be subsumed within the FLK algorithm as a sparse diagonal weighting matrix, (iii) unlike traditional LK the computational cost is invariant to the number of filters and as a result far more efficient, and (iv) this approach can be extended to the inverse compositional form of the LK algorithm where nearly all steps (including Fourier transform and filter bank pre-processing) can be pre-computed leading to an extremely efficient and robust approach to gradient descent image matching. Further, these computational savings translate to non-rigid object alignment tasks that are considered extensions of the LK algorithm such as those found in Active Appearance Models (AAMs).
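
A full FLK implementation iterates gradient-descent warp updates with filter-bank weighting, which is too long to reproduce here. As a much simpler illustration of matching in the Fourier domain, the sketch below (a hypothetical example, not the authors' algorithm) recovers a pure integer translation from the peak of a cross-correlation computed with FFTs:

```python
import numpy as np

def fft_translation(template, image):
    """Estimate the integer (dy, dx) translation of `image` relative to
    `template` via the peak of their circular cross-correlation,
    computed in the Fourier domain."""
    F = np.fft.fft2(template)
    G = np.fft.fft2(image)
    corr = np.fft.ifft2(np.conj(F) * G).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map wrap-around shifts beyond half the image size to negatives.
    h, w = corr.shape
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)

# A random template and a copy circularly shifted by (3, 5) pixels.
rng = np.random.default_rng(0)
template = rng.standard_normal((32, 32))
shifted = np.roll(template, shift=(3, 5), axis=(0, 1))
print(fft_translation(template, shifted))  # → (3, 5)
```

The full FLK formulation generalizes this idea to arbitrary warps and filter banks, exploiting Parseval's relation so that the filter-bank pre-processing becomes a sparse diagonal weighting in the Fourier domain.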

Relevance:

10.00%

Publisher:

Abstract:

Objective: To determine the prevalence, severity, location, etiology, treatment, and healing of medical device-related pressure ulcers in intensive care patients for up to 7 days. Design: Prospective repeated-measures study. Setting and participants: Patients in 6 intensive care units of 2 major medical centers, one each in Australia and the United States, were screened 1 day per month for 6 months. Those with device-related ulcers were followed daily for up to 7 days. Outcome measures: Device-related ulcer prevalence, pain, infection, treatment, and healing. Results: 15 of 483 patients had device-related ulcers, and 9 of these 15, with 11 ulcers between them, were followed beyond screening. Their mean age was 60.5 years; most were men, overweight, and at increased pressure ulcer risk. Endotracheal and nasogastric tubes were the cause of most device-related ulcers. Repositioning was the most frequent treatment. 4 of the 11 ulcers healed within the 7-day observation period. Conclusion: Device-related ulcer prevalence was 3.1%, similar to that reported in the limited literature available, indicating an ongoing problem. Systematic assessment and repositioning of devices are the mainstays of care. We recommend continued prevalence determination and that nurses remain vigilant to prevent device-related ulcers, especially in patients with nasogastric and endotracheal tubes.

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: US Centers for Disease Control guidelines recommend replacement of peripheral intravenous (IV) catheters no more frequently than every 72 to 96 hours. Routine replacement is thought to reduce the risk of phlebitis and bloodstream infection. Catheter insertion is an unpleasant experience for patients and replacement may be unnecessary if the catheter remains functional and there are no signs of inflammation. Costs associated with routine replacement may be considerable. This is an update of a review first published in 2010. OBJECTIVES: To assess the effects of removing peripheral IV catheters when clinically indicated compared with removing and re-siting the catheter routinely. SEARCH METHODS: For this update the Cochrane Peripheral Vascular Diseases (PVD) Group Trials Search Co-ordinator searched the PVD Specialised Register (December 2012) and CENTRAL (2012, Issue 11). We also searched MEDLINE (last searched October 2012) and clinical trials registries. SELECTION CRITERIA: Randomised controlled trials that compared routine removal of peripheral IV catheters with removal only when clinically indicated in hospitalised or community dwelling patients receiving continuous or intermittent infusions. DATA COLLECTION AND ANALYSIS: Two review authors independently assessed trial quality and extracted data. MAIN RESULTS: Seven trials with a total of 4895 patients were included in the review. Catheter-related bloodstream infection (CRBSI) was assessed in five trials (4806 patients). There was no significant between group difference in the CRBSI rate (clinically-indicated 1/2365; routine change 2/2441). The risk ratio (RR) was 0.61 but the confidence interval (CI) was wide, creating uncertainty around the estimate (95% CI 0.08 to 4.68; P = 0.64). No difference in phlebitis rates was found whether catheters were changed according to clinical indications or routinely (clinically-indicated 186/2365; 3-day change 166/2441; RR 1.14, 95% CI 0.93 to 1.39). 
This result was unaffected by whether infusion through the catheter was continuous or intermittent. We also analysed the data by number of device days and again no differences between groups were observed (RR 1.03, 95% CI 0.84 to 1.27; P = 0.75). One trial assessed all-cause bloodstream infection. There was no difference in this outcome between the two groups (clinically-indicated 4/1593 (0.02%); routine change 9/1690 (0.05%); P = 0.21). Cannulation costs were lower by approximately AUD 7.00 in the clinically-indicated group (mean difference (MD) -6.96, 95% CI -9.05 to -4.86; P ≤ 0.00001). AUTHORS' CONCLUSIONS: The review found no evidence to support changing catheters every 72 to 96 hours. Consequently, healthcare organisations may consider changing to a policy whereby catheters are changed only if clinically indicated. This would provide significant cost savings and would spare patients the unnecessary pain of routine re-sites in the absence of clinical indications. To minimise peripheral catheter-related complications, the insertion site should be inspected at each shift change and the catheter removed if signs of inflammation, infiltration, or blockage are present.
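
A crude risk ratio for any of the 2×2 comparisons above can be computed directly from the counts; note that the review's reported figures (e.g. RR 0.61 for CRBSI, 1.14 for phlebitis) are pooled meta-analytic estimates across trials, so a crude ratio from the raw totals differs slightly. A minimal sketch:

```python
import math

def risk_ratio(events_a, n_a, events_b, n_b):
    """Crude risk ratio of group A vs group B for a single 2x2 table,
    with a 95% Wald confidence interval on the log scale."""
    rr = (events_a / n_a) / (events_b / n_b)
    se = math.sqrt(1/events_a - 1/n_a + 1/events_b - 1/n_b)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Phlebitis counts from the review: 186/2365 clinically indicated
# vs 166/2441 routine change.
rr, lo, hi = risk_ratio(186, 2365, 166, 2441)
print(round(rr, 2))  # → 1.16 (the pooled estimate reported is 1.14)
```

The wide CI around the CRBSI estimate reflects the tiny event counts (1 and 2 events); with so few events, a crude Wald interval is also very imprecise.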

Relevance:

10.00%

Publisher:

Abstract:

As of June 2009, 361 genome-wide association studies (GWAS) had been referenced by the HuGE database. GWAS require DNA from many thousands of individuals, relying on suitable DNA collections. We recently performed a multiple sclerosis (MS) GWAS where a substantial component of the cases (24%) had DNA derived from saliva. Genotyping was done on the Illumina genotyping platform using the Infinium Hap370CNV DUO microarray. Additionally, we genotyped 10 individuals in duplicate using both saliva- and blood-derived DNA. The performance of blood- versus saliva-derived DNA was compared using genotyping call rate, which reflects both the quantity and quality of genotyping per sample and the “GCScore,” an Illumina genotyping quality score, which is a measure of DNA quality. We also compared genotype calls and GCScores for the 10 sample pairs. Call rates were assessed for each sample individually. For the GWAS samples, we compared data according to source of DNA and center of origin. We observed high concordance in genotyping quality and quantity between the paired samples and minimal loss of quality and quantity of DNA in the saliva samples in the large GWAS sample, with the blood samples showing greater variation between centers of origin. This large data set highlights the usefulness of saliva DNA for genotyping, especially in high-density single-nucleotide polymorphism microarray studies such as GWAS.
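
Call rate, as used above, is simply the fraction of assayed markers that yield a genotype call for a sample. A minimal sketch with hypothetical paired calls (illustrative data, not from the study):

```python
def call_rate(genotypes):
    """Fraction of markers with a successful call; None marks a no-call."""
    called = sum(1 for g in genotypes if g is not None)
    return called / len(genotypes)

# Hypothetical calls for one individual genotyped twice,
# from blood- and saliva-derived DNA.
blood  = ["AA", "AB", "BB", "AA", None, "AB", "AA", "BB"]
saliva = ["AA", "AB", "BB", None, None, "AB", "AA", "BB"]

print(call_rate(blood), call_rate(saliva))  # → 0.875 0.75

# Concordance over markers called in both samples.
concordant = sum(b == s for b, s in zip(blood, saliva)
                 if b is not None and s is not None)
print(concordant)  # → 6
```

Real array QC would apply the same per-sample computation over hundreds of thousands of SNPs, alongside per-call quality scores such as Illumina's GCScore.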

Relevance:

10.00%

Publisher:

Abstract:

The weather forecast centers in Australia and many other countries use a scale of cyclone intensity categories (categories 1-5) in their cyclone advisories, which is considered indicative of a cyclone's damage potential. However, this scale is based mainly on maximum gust wind speeds. In a recent research project involving computer modeling of cyclonic wind forces on roof claddings and fatigue damage to claddings, it was found that cyclone damage depends not only on the maximum gust wind speed but also on two other cyclone parameters, namely the forward speed and the radius of maximum winds. This paper describes the computer model used in predicting cyclone damage to claddings and investigates the damage potential of a cyclone as a function of all the relevant cyclone parameters, on the basis of which it attempts to refine the current scale of cyclone intensity categories.

Relevance:

10.00%

Publisher:

Abstract:

Background Bronchiectasis unrelated to cystic fibrosis (CF) is being increasingly recognized in children and adults globally, both in resource-poor and in affluent countries. However, high-quality evidence to inform management is scarce. Oral amoxycillin-clavulanate is often the first antibiotic chosen for non-severe respiratory exacerbations, because of the antibiotic-susceptibility patterns detected in the respiratory pathogens commonly associated with bronchiectasis. Azithromycin has a prolonged half-life, and with its unique anti-bacterial, immunomodulatory, and anti-inflammatory properties, presents an attractive alternative. Our proposed study will test the hypothesis that oral azithromycin is non-inferior (within a 20% margin) to amoxycillin-clavulanate at achieving resolution of non-severe respiratory exacerbations by day 21 of treatment in children with non-CF bronchiectasis. Methods This will be a multicenter, randomized, double-blind, double-dummy, placebo-controlled, parallel group trial involving six Australian and New Zealand centers. In total, 170 eligible children will be stratified by site and bronchiectasis etiology, and randomized (allocation concealed) to receive: 1) azithromycin (5 mg/kg daily) with placebo amoxycillin-clavulanate or 2) amoxycillin-clavulanate (22.5 mg/kg twice daily) with placebo azithromycin for 21 days as treatment for non-severe respiratory exacerbations. Clinical data and a parent-proxy cough-specific quality of life (PC-QOL) score will be obtained at baseline, at the start and resolution of exacerbations, and on day 21. In most children, blood and deep-nasal swabs will also be collected at the same time points. The primary outcome is the proportion of children whose exacerbations have resolved at day 21. The main secondary outcome is the PC-QOL score. Other outcomes are: time to next exacerbation; requirement for hospitalization; duration of exacerbation, and spirometry data. 
Descriptive viral and bacteriological data from nasal samples and blood inflammatory markers will be reported where available. Discussion: Currently, there are no published randomized controlled trials (RCT) to underpin effective, evidence-based management of acute respiratory exacerbations in children with non-CF bronchiectasis. To help address this information gap, we are conducting two RCTs. The first (bronchiectasis exacerbation study; BEST-1) evaluates the efficacy of azithromycin and amoxycillin-clavulanate compared with placebo, and the second RCT (BEST-2), described here, is designed to determine whether azithromycin is non-inferior to amoxycillin-clavulanate in achieving symptom resolution by day 21 of treatment in children with acute respiratory exacerbations. Trial registration: Australian New Zealand Clinical Trials Registry (ANZCTR) number ACTRN12612000010897. http://www.anzctr.org.au/trial_view.aspx?id=347879

Relevance:

10.00%

Publisher:

Abstract:

Building knowledge economies seems synonymous with re-imaging urban fabrics. Cities that produce vibrant public realms are believed to have better success in distinguishing themselves within a highly competitive market. Many governments are investing heavily in cultural enhancements to build distinctive cosmopolitan centers, in which public art is emerging as a significant element. Brisbane’s goal to grow a knowledge-based economy similarly addresses public art. To stimulate engagement with public art, Brisbane City Council has delivered an online public art catalogue and assembled three public art trails, with a fourth recently added. While many pieces along these trails are obviously public, others call the term ‘public’ into question through an obscured milieu in which a ‘look but don’t touch’ policy is subtly implied. This study investigates the interactional relationship between publics and public art and, in doing so, explores the concept of accessibility. This paper recommends that installations of sculpture within an emerging city be considered in terms of economic output, measured by the degree to which the public engages with them.

Relevance:

10.00%

Publisher:

Abstract:

Objective: Modern series from high-volume esophageal centers report an approximate 40% 5-year survival in patients treated with curative intent and postoperative mortality rates of less than 4%. An objective analysis of factors that underpin current benchmarks within high-volume centers has not been performed. Methods: Three time periods were studied, 1990 to 1998 (period 1), 1999 to 2003 (period 2), and 2004 to 2008 (period 3), in which 471, 254, and 342 patients, respectively, with esophageal cancer were treated with curative intent. All data were prospectively recorded, and staging, pathology, treatment, operative, and oncologic outcomes were compared. Results: Five-year disease-specific survival was 28%, 35%, and 44%, and in-hospital postoperative mortality was 6.7%, 4.4%, and 1.7% for periods 1 to 3, respectively (P < .001). Period 3, compared with periods 1 and 2, respectively, was associated with significantly (P < .001) more early tumors (17% vs 4% and 6%), higher nodal yields (median 22 vs 11 and 18), and a higher R0 rate in surgically treated patients (81% vs 73% and 75%). The use of multimodal therapy increased (P < .05) across time periods. By multivariate analysis, age, T stage, N stage, vascular invasion, R status, and time period were significantly (P < .0001) associated with outcome. Conclusions: Improved survival with localized esophageal cancer in the modern era may reflect an increase of early tumors and optimized staging. Important surgical and pathologic standards, including a higher R0 resection rate and nodal yields, and lower postoperative mortality, were also observed. Copyright © 2012 by The American Association for Thoracic Surgery.