292 results for Total reducing sugars


Relevance: 20.00%

Abstract:

Background: There is considerable and ongoing debate about the role and effectiveness of school-based injury prevention programs in reducing students' later involvement in alcohol-associated transport injuries. Most of the relevant literature concerns pre-driving and licensing programs for adolescents in the middle age range (15-17 years). This research team is concerned with prevention at an earlier stage, by targeting interventions to young adolescents (13-14 years). There is strong evidence that young adolescents who engage in unsafe and illegal alcohol-associated transport risks are significantly more likely to incur serious related injuries at longitudinal follow-up. For example, in a state-wide representative sample, male adolescents (mean age 14.5 years) who reported being passengers of drink drivers were significantly more likely to have incurred a hospitalised traffic-related injury at 20-year follow-up.

Aim: This paper reports on first aid training integrated with peer protection and school connectedness within the Skills for Preventing Injury in Youth (SPIY) program. A component of the intervention provides strategies to reduce the likelihood of being a passenger of a drink driver, and effectiveness is followed up at six months post-intervention.

Method: In early 2012 the study was undertaken in 35 high schools throughout Queensland that were randomly assigned to intervention and control conditions. A total of 2,521 Year 9 students (mean age 13.5 years, 43% male) completed surveys prior to the intervention.

Results: Of these students, 316 (13.7%) reported having ridden in a car with someone who had been drinking. This is a traffic safety behaviour that is particularly relevant to a peer protection intervention, and the findings of the six-month follow-up will be reported.

Discussion and conclusions: This research will provide evidence as to whether this approach to introducing first aid skills within a school-based health education curriculum has traffic safety implications.

Relevance: 20.00%

Abstract:

Crashes that occur on motorways account for a significant proportion (40-50%) of non-recurrent motorway congestion. Hence, reducing the frequency of crashes assists in addressing congestion issues (Meyer, 2008). Crash likelihood estimation studies commonly focus on traffic conditions in a short time window around the time of a crash, while longer-term pre-crash traffic flow trends are neglected. In this paper we show, through data mining techniques, that a relationship exists between pre-crash traffic flow patterns and crash occurrence on motorways. We compare these patterns with normal traffic trends and show that this knowledge has the potential to improve the accuracy of existing models and opens the path for new development approaches. The data for the analysis were extracted from records collected between 2007 and 2009 on the Shibuya and Shinjuku lines of the Tokyo Metropolitan Expressway in Japan. The dataset includes a total of 824 rear-end and sideswipe crashes that have been matched with their corresponding traffic flow data using an incident detection algorithm. Traffic trends (traffic speed time series) revealed that crashes can be clustered with regard to the dominant traffic patterns prior to the crash. The K-Means clustering method with a Euclidean distance function was used to cluster the crashes. Normal-situation data were then extracted based on the time distribution of crashes and clustered for comparison with the "high risk" clusters. Five major trends were found in the clustering results for both high-risk and normal conditions, and the identified traffic regimes differed in their speed trends. Based on these findings, crash likelihood estimation models can be fine-tuned to the monitored traffic conditions with a sliding window of 30 minutes, to increase the accuracy of the results and minimize false alarms.
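The clustering step named here is standard enough to sketch. A minimal illustration of K-Means with the (default) Euclidean distance over 30-minute speed time series, using synthetic data in place of the Tokyo Metropolitan Expressway records; the array shapes, sampling resolution and all values are illustrative assumptions, not the study's data:

```python
import numpy as np
from sklearn.cluster import KMeans

# Illustrative only: 824 pre-crash windows, each a 30-minute speed
# time series sampled at 1-minute resolution (30 values, km/h).
rng = np.random.default_rng(0)
speed_windows = rng.uniform(20, 100, size=(824, 30))

# K-Means with Euclidean distance (scikit-learn's metric);
# k=5 mirrors the five major trends reported in the abstract.
kmeans = KMeans(n_clusters=5, n_init=10, random_state=0)
labels = kmeans.fit_predict(speed_windows)

# Cluster centroids approximate the dominant pre-crash speed trends.
for i, centroid in enumerate(kmeans.cluster_centers_):
    print(f"cluster {i}: {np.sum(labels == i)} crashes, "
          f"mean speed {centroid.mean():.1f} km/h")
```

The same fit would be repeated on the normal-situation windows so the "high risk" centroids can be compared against normal centroids, as the abstract describes.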

Relevance: 20.00%

Abstract:

BACKGROUND/OBJECTIVES: Recent work suggests that macronutrients are pro-inflammatory and promote oxidative stress. Reports of postprandial regulation of total adiponectin have been mixed, and there is limited information regarding postprandial changes in high molecular weight (HMW) adiponectin. The aim of this study was to assess the effect of a standardised high-fat meal on metabolic variables, adiponectin (total and HMW), and markers of inflammation and oxidative stress in (i) lean men, (ii) obese non-diabetic men and (iii) men with type 2 diabetes mellitus (T2DM).

SUBJECTS/METHODS: Male subjects (lean, n=10; obese, n=10; T2DM, n=10) were studied for 6 h following both a high-fat meal and a water control. Metabolic variables (glucose, insulin, triglycerides), inflammatory markers (interleukin-6 (IL6), tumour necrosis factor (TNF)α, high-sensitivity C-reactive protein (hsCRP), nuclear factor (NF)κB expression in peripheral blood mononuclear cells (p65)), indicators of oxidative stress (oxidised low density lipoprotein (oxLDL), protein carbonyl) and adiponectin (total and HMW) were measured.

RESULTS: No significant changes in TNFα, p65, oxLDL or protein carbonyl concentrations were observed. Overall, postprandial IL6 decreased in subjects with T2DM but increased in lean subjects, whereas hsCRP decreased in the lean cohort and increased in obese subjects. There was no overall postprandial change in total or HMW adiponectin in any group. Total adiponectin concentrations changed over time following the water control, and the response was significantly different in lean subjects compared with subjects with T2DM (P=0.04).

CONCLUSIONS: No consistent significant postprandial inflammation, oxidative stress or regulation of adiponectin was observed in this study. Findings from the water control suggest differential basal regulation of total adiponectin in T2DM compared with lean controls.

Relevance: 20.00%

Abstract:

The key to reducing the cost of electric vehicles is integration. All too often, systems such as the motor, motor controller, batteries and vehicle chassis/body are treated as separate problems. In reality, many trade-offs can be made between these systems, yielding overall improvements in many areas, including total cost. Motor controller and battery cost have a relatively simple relationship: the less energy lost in the motor controller, the less energy has to be carried in the batteries, and hence the lower the battery cost. A motor controller's cost is primarily driven by the cost of its switches. This paper therefore presents a method of assessing optimal switch selection on the premise that the optimal switch is the one that produces the lowest system cost, where system cost is the cost of the batteries plus the switches.
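Read as an optimization, the premise is one line: choose the switch that minimizes battery cost plus switch cost. A minimal sketch of that selection rule; the candidate devices, loss figures and prices are all illustrative assumptions rather than values from the paper:

```python
# Hypothetical switch candidates: (name, unit cost in $, energy lost
# per drive cycle in kWh). All figures are illustrative assumptions.
candidates = [
    ("IGBT-A",   40.0, 0.80),
    ("IGBT-B",   65.0, 0.55),
    ("MOSFET-C", 90.0, 0.35),
]

BATTERY_COST_PER_KWH = 150.0  # assumed pack cost, $/kWh
SWITCHES_PER_CONTROLLER = 6   # e.g. a three-phase bridge

def system_cost(unit_cost: float, loss_kwh: float) -> float:
    """Cost of the switches plus the extra battery capacity that must
    be carried to cover the energy the switches dissipate."""
    switch_cost = SWITCHES_PER_CONTROLLER * unit_cost
    battery_cost = loss_kwh * BATTERY_COST_PER_KWH
    return switch_cost + battery_cost

# The "optimal" switch under this rule is simply the cost minimizer:
best = min(candidates, key=lambda c: system_cost(c[1], c[2]))
print(f"lowest system cost: {best[0]}")
```

The sketch makes the trade-off concrete: a pricier, more efficient switch wins whenever its extra purchase cost is smaller than the battery cost it saves.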

Relevance: 20.00%

Abstract:

Purpose: Intensity modulated radiotherapy (IMRT) treatments require more beam-on time, and therefore produce more linac head leakage, to deliver doses similar to conventional, unmodulated radiotherapy treatments. It is necessary to take this increased leakage into account when evaluating the results of radiation surveys around bunkers that are, or will be, used for IMRT. The recommended procedure of applying a monitor-unit based workload correction factor to secondary barrier survey measurements, to account for this increased leakage, can lead to potentially costly overestimation of the required barrier thickness. This study aims to provide initial guidance on the validity of reducing the value of the correction factor when applied to different radiation barriers (primary barriers, doors, maze walls and other walls) by evaluating three different bunker designs.

Methods: Radiation survey measurements of primary, scattered and leakage radiation were obtained at each of five survey points around each of three different radiotherapy bunkers, and the contribution of leakage to the total measured radiation dose at each point was evaluated. Measurements at each survey point were made with the linac gantry set to 12 equidistant positions from 0 to 330°, to assess the effects of radiation beam direction on the results.

Results: For all three bunker designs, less than 0.5% of the dose measured at and alongside the primary barriers, less than 25% of the dose measured outside the bunker doors, and up to 100% of the dose measured outside other secondary barriers was found to be caused by linac head leakage.

Conclusions: The results of this study suggest that IMRT workload corrections are unnecessary for survey measurements made at and alongside primary barriers. Use of reduced IMRT workload correction factors is recommended when evaluating survey measurements around a bunker door, provided that a subset of the measurements used in this study is repeated for the bunker in question. Reduction of the correction factor for other secondary barrier survey measurements is not recommended unless the contribution from leakage is separately evaluated.
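The correction the conclusions argue about reduces to simple arithmetic: scale only the leakage component of a survey measurement by the monitor-unit (workload) factor, rather than the whole measurement. A minimal sketch of that idea; the numerical values below are illustrative assumptions, not survey data from the study:

```python
def corrected_dose(measured_dose: float, leakage_fraction: float,
                   imrt_workload_factor: float) -> float:
    """Scale only the leakage component of a survey measurement by the
    IMRT workload (monitor-unit) factor, leaving primary and scattered
    radiation unchanged. A sketch of the idea in the abstract, not a
    shielding-code implementation."""
    leakage = measured_dose * leakage_fraction
    other = measured_dose * (1.0 - leakage_fraction)
    return other + leakage * imrt_workload_factor

# Using the abstract's bounds: <0.5% leakage at a primary barrier vs
# up to 100% at some secondary barriers. A workload factor of 5 is an
# illustrative assumption.
print(corrected_dose(10.0, 0.005, 5.0))  # ~10.2: correction negligible
print(corrected_dose(10.0, 1.0, 5.0))    # 50.0: full correction needed
```

This makes the paper's point visible: where leakage is a tiny fraction of the measured dose, applying the full workload factor to the whole measurement grossly overstates the required shielding.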

Relevance: 20.00%

Abstract:

Patients presenting for knee replacement who take warfarin for medical reasons often require higher levels of anticoagulation peri-operatively than are used for primary thromboprophylaxis, and may require bridging therapy with heparin. We performed a retrospective case-control study of 149 consecutive primary knee arthroplasty patients to investigate whether anticoagulation affected short-term outcomes. Specific outcome measures indicated significant increases in prolonged wound drainage (26.8% of cases vs 7.3% of controls, p<0.001); superficial infection (16.8% vs 3.3%, p<0.001); deep infection (6.0% vs 0%, p<0.001); return to theatre for washout (4.7% vs 0.7%, p=0.004); and revision (4.7% vs 0.3%, p=0.001). Management of patients on long-term warfarin therapy following total knee replacement (TKR) is particularly challenging, as the surgeon must balance the risk of thromboembolism against post-operative complications on an individual patient basis in order to optimise outcomes.

Relevance: 20.00%

Abstract:

Successful anatomic fitting of a total artificial heart (TAH) is vital to achieving optimal pump hemodynamics after device implantation. Although many anatomic fitting studies have been completed in humans prior to clinical trials, few reports detail the corresponding experience in animals for in vivo device evaluation. Optimal hemodynamics are crucial throughout the in vivo phase to direct design iterations and ultimately validate device performance prior to pivotal human trials. In vivo evaluation in a sheep model provides a realistically sized representation of a smaller patient, whom smaller third-generation TAHs have the potential to treat. Our study aimed to assess the anatomic fit of a single-device rotary TAH in sheep prior to animal trials and to use the data to develop a three-dimensional, computer-aided design (CAD)-operated anatomic fitting tool for future TAH development. Following excision of the native ventricles above the atrio-ventricular groove, a prototype TAH was inserted within the chest cavity of six sheep (28-40 kg). Adjustable rods representing inlet and outlet conduits were oriented toward the center of each atrial chamber and the great vessels, with conduit lengths and angles recorded for later analysis. A three-dimensional, CAD-operated anatomic fitting tool was then developed, based on the results of this study, and used to determine the inflow and outflow conduit orientation of the TAH. The mean diameters of the sheep left atrium, right atrium, aorta, and pulmonary artery were 39, 33, 12, and 11 mm, respectively. The center-to-center distance and outer-edge-to-outer-edge distance between the atria, found to be 39 ± 9 mm and 72 ± 17 mm in this study, were identified as the most critical geometries for successful TAH connection. This geometric constraint restricts the maximum separation allowable between the left and right inlet ports of a TAH to ensure successful alignment within the available atrial circumference; a rough check of this constraint is sketched below.
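The closing constraint lends itself to a small worked check, under the loose assumption that an inlet conduit can be oriented toward its atrial centre only if the port sits within some tolerance of that centre; the tolerance value here is purely illustrative and not from the study:

```python
def ports_fit(port_separation_mm: float,
              atrial_centre_distance_mm: float,
              tolerance_mm: float = 10.0) -> bool:
    """Rough feasibility check sketching the constraint in the
    abstract: the TAH's left/right inlet port separation must stay
    close to the centre-to-centre distance between the atria so each
    conduit can point at the centre of its atrial chamber. The
    tolerance is an assumed stand-in, not a value from the study."""
    return abs(port_separation_mm - atrial_centre_distance_mm) <= tolerance_mm

# Mean centre-to-centre atrial distance reported in the study: 39 mm.
for separation in (30, 39, 55):
    print(separation, "mm:", ports_fit(separation, 39.0))
```

A CAD-operated fitting tool like the one described would apply this kind of test across the measured range (39 ± 9 mm) rather than the mean alone.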

Relevance: 20.00%

Abstract:

This thesis studied technology’s role in promoting and supporting active lifestyles through behavioural strategies to reduce sedentary time and increase physical activity. The five studies included (1) development of a self-report instrument quantifying daily sedentary behaviour and light-intensity physical activity; (2) establishment of instrument validity and reliability; (3) use of an online personal activity monitor to successfully reduce sedentary time and increase physical activity; (4) identification of positive differences in total wellness as related to high/low levels of sitting time combined with insufficient/sufficient physical activity; and (5) improvement of total wellness through positive changes in sedentary behaviour and physical activity.

Relevance: 20.00%

Abstract:

We demonstrate a rapid synthesis of gold nanoparticles using hydroquinone as a reducing agent under acidic conditions, without the need for precursor seed particles. The nanoparticle formation process is facilitated by the addition of NaOH to a solution containing HAuCl4 and hydroquinone to locally change the pH; this enhances the reducing capability of hydroquinone so that gold nucleation centres form, after which further growth of gold can take place through an autocatalytic mechanism. The stability of the nanoparticles depends strongly on the initial solution pH and on the concentrations of both the added NaOH and the hydroquinone present in solution. The gold nanoparticles were characterized by UV-visible spectroscopy, transmission electron microscopy, Fourier transform infrared spectroscopy, atomic force microscopy, dynamic light scattering, and zeta potential measurements. It was found that, under optimal conditions, stable aqueous suspensions of 20 nm diameter nanoparticles can be achieved, where benzoquinone, the oxidized product of hydroquinone, acts as a capping agent preventing nanoparticle aggregation.

Relevance: 20.00%

Abstract:

BACKGROUND: The prevalence of protein-energy malnutrition in older adults is reported to be as high as 60% and is associated with poor health outcomes. Inadequate feeding assistance and mealtime interruptions may contribute to malnutrition and poor nutritional intake during hospitalisation. Despite being widely implemented in practice in the United Kingdom, and increasingly in Australia, there have been few studies examining the impact of strategies such as Protected Mealtimes and dedicated feeding-assistant roles on the nutritional outcomes of elderly inpatients.

AIMS: The aim of this research was to implement and compare three system-level interventions designed specifically to address mealtime barriers and improve the energy intakes of medical inpatients aged ≥65 years. This research also aimed to evaluate the sustainability of any changes to mealtime routines six months post-intervention and to gain an understanding of staff perceptions of the post-intervention mealtime experience.

METHODS: Three mealtime assistance interventions were implemented in three medical wards at the Royal Brisbane and Women's Hospital: AIN-only: additional assistant-in-nursing (AIN) with a dedicated nutrition role; PM-only: multidisciplinary approach to meals, including Protected Mealtimes; PM+AIN: combined intervention: AIN plus multidisciplinary approach to meals. An action research approach was used to carefully design and implement the three interventions in partnership with ward staff and managers. Significant time was spent in consultation with staff throughout the implementation period to facilitate ownership of the interventions and increase the likelihood of successful implementation. A pre-post design was used to compare the implementation and nutritional outcomes of each intervention with a pre-intervention group. Using the same wards, eligible participants (medical inpatients aged ≥65 years) were recruited to the pre-intervention group between November 2007 and March 2008, and to the intervention groups between January and June 2009. The primary nutritional outcome was daily energy and protein intake, which was determined by visually estimating plate waste at each meal and mid-meal on Day 4 of admission. Energy and protein intakes were compared between the pre- and post-intervention groups. Data were collected on a range of covariates (demographics, nutritional status and known risk factors for poor food intake), which allowed for multivariate analysis of the impact of the interventions on nutritional intake. The provision of mealtime assistance to participants and the activities of ward staff (including mealtime interruptions) were observed in the pre-intervention and intervention groups, with staff observations repeated six months post-intervention. Focus groups were conducted with nursing and allied health staff in June 2009 to explore their attitudes and behaviours in response to the three mealtime interventions. These focus group discussions were analysed using thematic analysis.

RESULTS: A total of 254 participants were recruited to the study (pre-intervention: n=115, AIN-only: n=58, PM-only: n=39, PM+AIN: n=42). Participants had a mean age of 80 years (SD 8); 40% (n=101) were malnourished on hospital admission, 50% (n=108) had anorexia and 38% (n=97) required some assistance at mealtimes. Occasions of mealtime assistance significantly increased in all interventions (p<0.01). However, no change was seen in mealtime interruptions. No significant difference was seen in mean total energy and protein intake between the pre-intervention and intervention groups. However, when total kilojoule intake was compared with estimated requirements at the individual level, participants in the intervention groups were more likely to achieve adequate energy intake (OR=3.4, p=0.01), with no difference noted between interventions (p=0.29). Despite small improvements in nutritional adequacy, the majority of participants in the intervention groups (76%, n=103) had energy intakes inadequate to meet their estimated energy requirements. Patients with cognitive impairment or feeding dependency appeared to gain substantial benefit from the mealtime assistance interventions. The increase in occasions of mealtime assistance by nursing staff during the intervention period was maintained six months post-intervention. Staff focus groups highlighted the importance of clearly designating and defining mealtime responsibilities in order to provide adequate mealtime care. While the purpose of the dedicated feeding assistant was to increase levels of mealtime assistance, staff indicated that responsibility for mealtime duties may have merely shifted from nursing staff to the assistant. Implementing the multidisciplinary interventions empowered nursing staff to "protect" the mealtime from external interruptions, but further work is required to empower nurses to prioritise mealtime activities within their own work schedules. Staff reported an increase in the profile of nutritional care on all wards, with additional non-nutritional benefits noted, including improved mobility and functional independence, and better identification of swallowing difficulties.

IMPLICATIONS: This PhD research provides clinicians with practical strategies to immediately introduce change to deliver better mealtime care in the hospital setting and, as such, has initiated local and state-wide roll-out of mealtime assistance programs. Improved nutritional intakes of elderly inpatients were observed; however, given the modest effect size and reducing lengths of hospital stay, better nutritional outcomes may be achieved by targeting the hospital-to-home transition period. Findings from this study suggest that mealtime assistance interventions for elderly inpatients with cognitive impairment and/or functional dependency show promise.

Relevance: 20.00%

Abstract:

Traditionally, infectious diseases and under-nutrition have been considered the major health problems in Sri Lanka, with little attention paid to obesity and associated non-communicable diseases (NCDs). However, the recent Sri Lanka Diabetes and Cardiovascular Study (SLDCS) reported epidemic levels of obesity, diabetes and metabolic syndrome. Moreover, obesity-associated NCDs are the leading cause of death in Sri Lanka, and there is an exponential increase in hospitalization due to NCDs that adversely affects the development of the country. Despite Sri Lanka having a very high prevalence of NCDs and associated mortality, little is known about the causative factors for this burden. It is widely believed that the global NCD epidemic is associated with recent lifestyle changes, especially dietary factors. In the absence of sufficient data on dietary habits in Sri Lanka, successful interventions to manage these serious health issues would not be possible. In view of the current situation, this dietary survey was undertaken to assess intakes of energy, macronutrients and selected other nutrients with respect to socio-demographic characteristics and the nutritional status of Sri Lankan adults, focusing especially on obesity. Another aim of this study was to develop and validate a culturally specific food frequency questionnaire (FFQ) to assess dietary risk factors for NCDs in Sri Lankan adults.

Data were collected from a subset of the national SLDCS using a multi-stage, stratified, random sampling procedure (n=500). However, data collection in the SLDCS was affected by the prevailing civil war, which resulted in no data being collected from the Northern and Eastern provinces. To obtain a nationally representative sample, additional subjects (n=100) were later recruited from the two provinces using similar selection criteria. Ethical approval for this study was obtained from the Ethical Review Committee, Faculty of Medicine, University of Colombo, Sri Lanka, and informed consent was obtained from the subjects before data were collected. Dietary data were obtained using the 24-h Dietary Recall (24HDR) method: subjects were asked to recall all foods and beverages consumed over the previous 24-hour period, and respondents were probed for the types of foods and food preparation methods. For the FFQ validation study, a 7-day weighed diet record (7-d WDR) was used as the reference method. All foods recorded in the 24HDR were converted into grams, and intakes of energy and nutrients were then analysed using NutriSurvey 2007 (EBISpro, Germany), modified for Sri Lankan food recipes. Socio-demographic details and body weight perception were collected using an interviewer-administered questionnaire. BMI was calculated, and overweight (BMI ≥23 kg.m-2), obesity (BMI ≥25 kg.m-2) and abdominal obesity (men: WC ≥90 cm; women: WC ≥80 cm) were categorized according to Asia-Pacific anthropometric cut-offs. SPSS v16 for Windows and Minitab v10 were used for statistical analysis.

From a total of 600 eligible subjects, 491 (81.8%) participated, of whom 34.5% (n=169) were male. Subjects were well distributed across different socio-economic parameters. A total of 312 different food items were recorded, and nutritionists grouped similar items, resulting in 178 items. After step-wise multiple regression, 93 foods explained 90% of the variance for total energy intake, carbohydrate, protein, total fat and dietary fibre. Finally, 90 food items and 12 photographs were selected. Seventy-seven subjects (response rate 65%) completed both the FFQ and the 7-d WDR. Estimated mean (SD) energy intake from the FFQ (1794±398 kcal) and the 7-d WDR (1698±333 kcal) differed significantly (P<0.001), owing to a significant overestimation of carbohydrate (~10 g/d, P<0.001) and, to some extent, fat (~5 g/d, NS). Significant positive correlations were found between the FFQ and the 7-d WDR for energy (r=0.39), carbohydrate (r=0.47), protein (r=0.26), fat (r=0.17) and dietary fibre (r=0.32). Bland-Altman plots indicated fairly good agreement between the methods, with no relationship between bias and average intake for any of the nutrients examined.

The nutrition survey showed that, on average, Sri Lankan adults consumed over 14 portions of starch/d; moreover, males consumed 5 more portions of cereal than females. Sri Lankan adults consumed on average 3.56 portions of added sugars/d. Mean daily intakes of fruit (0.43 portions) and vegetables (1.73 portions) were well below the minimum dietary recommendations (fruits 2 portions/d; vegetables 3 portions/d); total fruit and vegetable intake was 2.16 portions/d. Daily consumption of meat or alternatives was 1.75 portions, and the sum of meat and pulses was 2.78 portions/d. Starchy foods were consumed by all participants, and over 88% met the minimum daily recommendations. Importantly, nearly 70% of adults exceeded the maximum daily recommendation for starch (11 portions/d), and a considerable proportion, particularly men, consumed larger numbers of starch servings daily: more than 12% of men consumed over 25 starch servings/d. In contrast to their starch consumption, participants reported very low intakes of the other food groups. Only 11.6%, 2.1% and 3.5% of adults consumed the minimum daily recommended servings of vegetables, fruits, and fruits and vegetables combined, respectively. Six out of ten adult Sri Lankans sampled did not consume any fruit. Milk and dairy consumption was extremely low: over a third of the population did not consume any dairy products, and less than 1% of adults consumed 2 portions of dairy/d. A quarter of Sri Lankans did not report consumption of meat and pulses. Regarding protein consumption, 36.2% attained the minimum Sri Lankan recommendation for protein, and significantly more men than women achieved the recommendation of ≥3 servings of meat or alternatives daily (men 42.6%, women 32.8%; P<0.05). Over 70% of energy was derived from carbohydrate (male: 72.8±6.4%; female: 73.9±6.7%), followed by fat (male: 19.9±6.1%; female: 18.5±5.7%) and protein (male: 10.6±2.1%; female: 10.9±5.6%). The average intake of dietary fibre was 21.3 g/d for males and 16.3 g/d for females. Nutritional intake differed significantly by ethnicity, area of residence, education level and BMI category. Similarly, dietary diversity was significantly associated with several socio-economic parameters among Sri Lankan adults; adults with BMI ≥25 kg.m-2 and abdominally obese adults had the highest diet diversity values.

Age-adjusted prevalences (95% confidence interval) of overweight, obesity and abdominal obesity among Sri Lankan adults were 17.1% (13.8-20.7), 28.8% (24.8-33.1) and 30.8% (26.8-35.2), respectively. Men, compared with women, were less overweight (14.2% (9.4-20.5) versus 18.5% (14.4-23.3), P=0.03), less obese (21.0% (14.9-27.7) versus 32.7% (27.6-38.2), P<0.05) and less abdominally obese (11.9% (7.4-17.8) versus 40.6% (35.1-46.2), P<0.05). Although the prevalence of obesity has reached epidemic levels, body weight misperception was common among Sri Lankan adults. Two-thirds of overweight males and 44.7% of overweight females considered themselves to be "about the right weight". Over one third of both male and female obese subjects perceived themselves to be "about the right weight" or "underweight". Nearly 32% of centrally obese men and women perceived their waist circumference to be about right. Of the people who perceived themselves as overweight or very overweight (n=154), only 63.6% (n=98) tried to lose weight, and a quarter (n=39) sought advice from professionals.

A number of important conclusions can be drawn from this research project. Firstly, the newly developed FFQ is an acceptable tool for assessing the nutrient intake of Sri Lankans and will assist proper categorization of individuals by dietary exposure. Secondly, a substantial proportion of the Sri Lankan population does not consume a varied and balanced diet, which suggests a close association between the nutrition-related NCDs in the country and unhealthy eating habits; moreover, dietary diversity is positively associated with several socio-demographic characteristics and with obesity among Sri Lankan adults. Lastly, although obesity is a major health issue among Sri Lankan adults, body weight misperception was common among underweight, healthy weight, overweight and obese adults: over two-thirds of overweight and one-third of obese Sri Lankan adults believed themselves to be of the "right weight" or "under-weight".
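The FFQ-versus-7-d WDR comparison above rests on Bland-Altman agreement analysis, which is compact enough to sketch. A minimal illustration with synthetic paired intakes standing in for the 77 subjects; the distributions are loosely matched to the reported means and SDs, and everything else is an illustrative assumption:

```python
import numpy as np

# Synthetic paired energy intakes (kcal/d) standing in for the 77
# subjects who completed both the FFQ and the 7-d weighed record.
rng = np.random.default_rng(1)
wdr = rng.normal(1698, 333, size=77)      # reference method
ffq = wdr + rng.normal(96, 250, size=77)  # assumed FFQ overestimation

diff = ffq - wdr
bias = diff.mean()
half_width = 1.96 * diff.std(ddof=1)      # 95% limits of agreement

print(f"bias: {bias:.0f} kcal/d")
print(f"limits of agreement: {bias - half_width:.0f} "
      f"to {bias + half_width:.0f} kcal/d")
```

Plotting `diff` against `(ffq + wdr) / 2` gives the Bland-Altman plot itself; "no relationship between bias and average intake" means that scatter shows no trend.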

Relevance: 20.00%

Abstract:

Crashes that occur on motorways account for a significant proportion (40-50%) of non-recurrent motorway congestion. Hence, reducing the frequency of crashes assists in addressing congestion issues (Meyer, 2008). Analysing traffic conditions and discovering risky traffic trends and patterns are essential foundations of crash likelihood estimation studies and still require more attention and investigation. In this paper we show, through data mining techniques, that there is a relationship between pre-crash traffic flow patterns and crash occurrence on motorways, compare these patterns with normal traffic trends, and show that this knowledge has the potential to improve the accuracy of existing crash likelihood estimation models and opens the path for new development approaches. The data for the analysis were extracted from records collected between 2007 and 2009 on the Shibuya and Shinjuku lines of the Tokyo Metropolitan Expressway in Japan. The dataset includes a total of 824 rear-end and sideswipe crashes that have been matched with their corresponding traffic flow data using an incident detection algorithm. Traffic trends (traffic speed time series) revealed that crashes can be clustered with regard to the dominant traffic patterns prior to the crash, and the K-Means clustering algorithm was applied to determine these dominant pre-crash patterns. In the first phase of this research, traffic regimes were identified by analysing crashes and normal traffic situations using half an hour of speed data at locations upstream of the crashes. The second phase then investigated different combinations of speed risk indicators to distinguish crashes from normal traffic situations more precisely. Five major trends were found in the first phase for both high-risk and normal conditions, and the identified traffic regimes differed in their speed trends. Moreover, the second phase shows that the spatiotemporal difference of speed is the best risk indicator among the combinations of speed-related risk indicators examined; a sketch of such an indicator follows. Based on these findings, crash likelihood estimation models can be fine-tuned to increase the accuracy of estimations and minimize false alarms.
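The phase-two result singles out the spatiotemporal difference of speed as the strongest indicator. The paper's exact formulation is not given in the abstract, so the sketch below uses an assumed stand-in: mean absolute speed differences across adjacent upstream detectors (spatial) and adjacent time steps (temporal), summed:

```python
import numpy as np

def spatiotemporal_speed_difference(speeds: np.ndarray) -> float:
    """Illustrative risk indicator in the spirit of the abstract.
    `speeds` is a (detectors x time steps) array of upstream speed
    readings; the score combines variation across detectors (spatial)
    and across time steps (temporal). An assumed stand-in, not the
    paper's formulation."""
    spatial = np.abs(np.diff(speeds, axis=0)).mean()   # between detectors
    temporal = np.abs(np.diff(speeds, axis=1)).mean()  # between time steps
    return spatial + temporal

# Smooth flow vs an abrupt slowdown propagating upstream
# (km/h, 3 detectors x 6 time steps; values are illustrative).
smooth = np.full((3, 6), 80.0)
shockwave = np.array([[80, 78, 55, 40, 42, 45],
                      [82, 80, 79, 60, 45, 44],
                      [81, 80, 80, 79, 62, 50]], dtype=float)
print(spatiotemporal_speed_difference(smooth))     # 0.0
print(spatiotemporal_speed_difference(shockwave))  # clearly larger
```

A shockwave-like regime scores far higher than smooth flow, which is the separation between "high risk" and normal situations the indicator is meant to capture.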

Relevance: 20.00%

Abstract:

Under the concept of Total Quality Control (TQC), and based on their experience, the authors discuss customers' quality requirements for immunization services and possible ways to meet them. Translated from the Chinese abstract: TQC is a quality management concept proposed in the 1960s by the Americans V. Feigenbaum and J. Juran. The well-known ISO 9000 family of standards is built on the TQC concept and has become the most authoritative, globally consistent set of international rules for quality management and quality assurance [1-2]. The 21st century is the century of quality: implementing TQC and continuously improving product and service quality has become an important means by which industries in China improve themselves and secure their survival and development under increasingly fierce market competition. Immunization is an important measure for preventing and controlling infectious diseases and protecting population health. In immunization work, the product is the immunization service, and the customers are the people who receive that service, i.e. the consumers of the product. With rapid social development and growing demand for health, higher quality requirements are being placed on immunization work. This paper comprehensively analyses customers' quality requirements for immunization services under the TQC model and offers a preliminary discussion of how service quality can be improved.

Relevance: 20.00%

Abstract:

With unpredictable workloads and a need for a multitude of specialized skills, many main contractors rely heavily on subcontracting to reduce their risks (Bresnen et al., 1985; Beardsworth et al., 1988). This is especially the case in Hong Kong, where the average direct labour content accounts for only around 1% of the total contract sum (Lai, 1987). Extensive use of subcontracting is also reported in many other countries, including the UK (Gray and Flanagan, 1989) and Japan (Bennett et al., 1987). In addition, depending on the scale and complexity of the works, it is not uncommon for subcontractors to further sublet their works to lower tiers of subcontractors. Richter and Mitchell (1982) argued that main contractors can obtain a higher profit margin by reducing their performance costs through subcontracting work to those who have the necessary resources to perform the work more efficiently and economically. Subcontracting is also used strategically to allow firms to employ a minimum workforce under fluctuating demand (Usdiken and Sözen, 1985). Through subcontracting, the risks of main contractors are also reduced, as errors in estimating, or additional costs caused by delays or extra labour requirements, can be absorbed by the subcontractors involved (Woon and Ofori, 2000). Despite these benefits, the quality of work can suffer when incapable or inexperienced subcontractors are employed. Additional problems also exist in the form of bid shopping, unclear accountability and high fragmentation (Palaneeswaran et al., 2002). A recent report produced by the Hong Kong Construction Industry Review Committee (CIRC) points to the development of a framework to help distinguish between capable and incapable subcontractors (Tang, 2001). This paper describes research aimed at identifying and prioritising criteria for use in such a framework.

Relevance: 20.00%

Abstract:

Sustainability has become an important principle to be pursued throughout the life-cycle of project development. Facility managers are in a commanding position to maximise the potential of sustainability. Sustainability endeavours in facility management (FM) practices will not only contribute to reducing energy consumption and waste, but will also help increase organisational productivity, financial returns and standing in the community. At the forefront of sustainable practices, FM professionals can exercise a great deal of influence through operational and strategic management, and they should be empowered with the necessary knowledge and capabilities. However, the literature suggests that there is a gap between the level of awareness and knowledge in the FM profession and the skills required to promote sustainability endeavours. It is therefore worthwhile to reflect on people capability, since it is considered the key enabler in managing the sustainability agenda and is central to the improvement of competency and innovation in an organisation. This paper aims to identify the critical factors for enhancing people capabilities in promoting the sustainability agenda in the FM sector. To achieve this objective, a total of 60 factors were identified through a comprehensive literature review, and a questionnaire survey with 52 respondents was then conducted to collect the perceived importance of these factors. The survey analysis revealed 23 critical factors as significantly important. These critical factors will serve as the basis for a mechanism to equip facility managers with the right knowledge, to continue education and training, and to develop new mind-sets that enhance the implementation of sustainability measures in FM practices.