453 results for The Provident Institution for Savings in Jersey City


Abstract:

The public apology to the Forgotten Australians in late 2009 was, for many, the culmination of a long campaign for recognition and justice. The groundswell for this apology was built through a series of submissions documenting the systemic institutionalised abuse and neglect experienced by the Forgotten Australians, which has resulted, for some, in life-long disadvantage and marginalisation. Interestingly, it seems that rather than the official documents being the catalyst for change and prompting the public apology, it was more often the personal stories of the Forgotten Australians that resonated and, over time, drew out a torrent of public support leading up to, during and after the apology, just as had been the case with the ‘Stolen Generation’. Research suggests (cite) that the ethics of such national apologies only make sense if these personal stories are treated as a collective responsibility of society, and only carry weight if we understand and seek to address, nationally, the trauma experienced by the victims. In the case of the Forgotten Australians, the National Library of Australia’s Forgotten Australians and Former Child Migrants Oral History Project and the National Museum’s Inside project demonstrate a commitment to digitising the Forgotten Australians’ stories in order to promote better public understanding of their experiences and to give them institutional, and therefore formal, recognition of renewed social importance. Our project builds on this work not by making or collecting more stories, but by examining the role of the internet and digital technologies in the production and dissemination of individuals’ stories already created between the tabling of the Senate inquiry, Children in Institutional Care (1999 or 2003?), and the formal national apology delivered in Federal Parliament by PM Kevin Rudd (9 Nov, 2009?). This timeframe also represents the first decade of widespread internet use by Australians, including the rapidly emerging, easily accessible digital technologies and social media tools at our disposal, along with the promise the technology claimed to offer: that more people would benefit from the social connections these technologies allegedly gave us.

Abstract:

Many governments throughout the world rely heavily on traffic law enforcement programs to modify driver behaviour and enhance road safety. There are two related functions of traffic law enforcement, apprehension and deterrence, and these are achieved through three processes: the establishment of traffic laws, the policing of those laws, and the application of penalties and sanctions to offenders. Traffic policing programs can vary by visibility (overt or covert) and deployment method (scheduled or non-scheduled), while sanctions can serve to constrain, deter or reform offending behaviour. This chapter reviews the effectiveness of traffic law enforcement strategies across a range of high-risk, illegal driving behaviours, including drink/drug driving, speeding, non-use of seat belts and red light running. Additionally, this chapter discusses how traffic police are increasingly using technology to enforce traffic laws and thus reduce crashes. The chapter concludes that effective traffic policing involves a range of both overt and covert operations and includes a mix of automated and more traditional manual enforcement methods. It is important to increase both the perceived and actual risk of detection by ensuring that traffic law enforcement operations are sufficiently intensive, unpredictable in nature and conducted as widely as possible across the road network. A key means of maintaining the unpredictability of operations is the random deployment of enforcement and/or the random checking of drivers. The impact of traffic enforcement is also heightened when it is supported by public education campaigns. In the future, technological improvements will allow the use of more innovative enforcement strategies. Finally, further research is needed to continue the development of traffic policing approaches and to address emerging road safety issues.

Abstract:

Advances in safety research, aimed at improving the collective understanding of motor vehicle crash causes and contributing factors, rest upon the pursuit of numerous lines of research inquiry. The research community has focused considerable attention on analytical methods development (negative binomial models, simultaneous equations, etc.), on better experimental designs (before-after studies, comparison sites, etc.), on improving exposure measures, and on model specification improvements (additive terms, non-linear relations, etc.). One might logically seek to know which lines of inquiry might provide the most significant improvements in understanding crash causation and/or prediction. It is the contention of this paper that the exclusion of important variables (causal or surrogate measures of causal variables) causes omitted variable bias in model estimation, and that this is an important and neglected line of inquiry in safety research. In particular, spatially related variables are often difficult to collect and are omitted from crash models, yet they offer significant opportunities to better understand the contributing factors and/or causes of crashes. This study examines the role of important variables (other than Average Annual Daily Traffic (AADT)) that are generally omitted from intersection crash prediction models. In addition to the geometric and traffic regulatory information of intersections, the proposed model includes several spatial factors such as local influences of weather, sun glare, proximity to drinking establishments, and proximity to schools, representing a mix of potential environmental and human factors that are theoretically important but rarely used. Results suggest that these variables, in addition to AADT, have significant explanatory power and that their exclusion leads to omitted variable bias. Evidence is provided that variable exclusion overstates the effect of minor road AADT by as much as 40% and that of major road AADT by 14%.
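To illustrate the kind of omitted-variable effect described above, the following is a minimal sketch in Python using statsmodels on simulated, entirely hypothetical intersection data (the variable names, coefficients, and the near_bars spatial factor are invented for illustration and are not taken from the study). Fitting the same negative binomial model with and without a spatial covariate that is correlated with minor-road AADT shows how the AADT coefficient can absorb the omitted effect:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 500

# Hypothetical intersection-level data: log AADT on the major and minor roads,
# plus one spatial factor (proximity to drinking establishments) that is
# correlated with minor-road AADT.
log_aadt_major = rng.normal(9.0, 0.5, n)
log_aadt_minor = rng.normal(7.5, 0.6, n)
near_bars = (log_aadt_minor + rng.normal(0, 0.4, n) > 7.8).astype(float)

# "True" crash-generating process includes the spatial factor; a gamma-mixed
# Poisson draw gives negative-binomial-like overdispersion.
mu = np.exp(-6.0 + 0.6 * log_aadt_major + 0.4 * log_aadt_minor + 0.5 * near_bars)
crashes = rng.poisson(rng.gamma(shape=2.0, scale=mu / 2.0))

df = pd.DataFrame({"crashes": crashes, "log_aadt_major": log_aadt_major,
                   "log_aadt_minor": log_aadt_minor, "near_bars": near_bars})

def fit_nb(predictors):
    X = sm.add_constant(df[predictors])
    return sm.GLM(df["crashes"], X,
                  family=sm.families.NegativeBinomial(alpha=0.5)).fit()

full = fit_nb(["log_aadt_major", "log_aadt_minor", "near_bars"])
reduced = fit_nb(["log_aadt_major", "log_aadt_minor"])   # spatial factor omitted

# Dropping the correlated spatial factor inflates the minor-road AADT coefficient.
print(full.params[["log_aadt_major", "log_aadt_minor"]])
print(reduced.params[["log_aadt_major", "log_aadt_minor"]])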

Abstract:

Construction firms are increasingly utilizing information technologies to better manage geographically dispersed projects. Often these technologies involve changes to existing working practices and processes and are viewed as disruptive by members of the organization. Understanding the factors that influence individuals’ intention to use technology can help managers implement strategies to increase and improve the uptake of technologies and improve the innovation adoption process. Using a case study organization, factors identified in the Unified Theory of Acceptance and Use of Technology (UTAUT) are examined, and the UTAUT is extended by including resistance to change and top management support. The findings indicate that effort expectancy, internal facilitating conditions and top management support all influence individuals’ intention to use information technology. The results also show that resistance to change, or fear of change, does not always play a role in innovation adoption. The findings reinforce the need to support new technologies from both a managerial and a technical perspective.

Abstract:

Nutrition interventions in the form of both self-management education and individualised diet therapy are considered essential for the long-term management of type 2 diabetes mellitus (T2DM). The measurement of diet is essential to inform, support and evaluate nutrition interventions in the management of T2DM. Barriers inherent within health care settings and systems limit ongoing access to personnel and resources, while traditional prospective methods of assessing diet are burdensome for the individual and often result in changes to typical intake to facilitate recording. This thesis investigated the use of information and communication technologies (ICT) to overcome limitations of current approaches to the nutritional management of T2DM, in particular the development, trial and evaluation of the Nutricam dietary assessment method (NuDAM), a mobile phone photo/voice application for assessing nutrient intake in a free-living environment with older adults with T2DM.

Study 1: Effectiveness of an automated telephone system in promoting change in dietary intake among adults with T2DM
The effectiveness of an automated telephone system, Telephone-Linked Care (TLC) Diabetes, designed to deliver self-management education, was evaluated in terms of promoting dietary change in adults with T2DM and sub-optimal glycaemic control. In this secondary data analysis, independent of the larger randomised controlled trial, complete data were available for 95 adults (59 male; mean age (±SD) = 56.8±8.1 years; mean (±SD) BMI = 34.2±7.0 kg/m2). The treatment effect showed a reduction in total fat of 1.4% and saturated fat of 0.9% of energy intake, body weight of 0.7 kg and waist circumference of 2.0 cm. In addition, a significant increase in the nutrition self-efficacy score of 1.3 (p<0.05) was observed in the TLC group compared with the control group. The modest trends observed indicate that the TLC Diabetes system does support the adoption of positive nutrition behaviours as a result of diabetes self-management education; however, caution must be applied in interpreting the results owing to the inherent limitations of the dietary assessment method used. The decision to use a closed-list FFQ with known bias may have influenced the accuracy of reported dietary intake in this instance. This study provided an example of the methodological challenges of measuring changes in absolute diet using an FFQ, and reaffirmed the need for novel prospective assessment methods capable of capturing natural variance in usual intake.

Study 2: The development and trial of the NuDAM recording protocol
The feasibility of the Nutricam mobile phone photo/voice dietary record was evaluated in 10 adults with T2DM (6 male; age = 64.7±3.8 years; BMI = 33.9±7.0 kg/m2). Intake was recorded over a 3-day period using both Nutricam and a written estimated food record (EFR). Compared to the EFR, the Nutricam device was acceptable to subjects; however, energy intake was under-recorded using Nutricam (-0.6±0.8 MJ/day; p<0.05). Beverages and snacks were the items most frequently omitted from Nutricam records, but forgotten meals contributed the greatest difference in energy intake between records. In addition, the quality of dietary data recorded using Nutricam was unacceptable for just under one-third of entries. It was concluded that an additional mechanism was necessary to complement dietary information collected via Nutricam. The method was modified to allow clarification of Nutricam entries and probing for forgotten foods during a brief phone call to the subject the following morning. The revised recording protocol was evaluated in Study 4.

Study 3: The development and trial of the NuDAM analysis protocol
Part A explored the effect of the type of portion size estimation aid (PSEA) on the error associated with quantifying four portions of 15 single food items contained in photographs. Seventeen dietetic students (1 male; age = 24.7±9.1 years; BMI = 21.1±1.9 kg/m2) estimated all food portions on two occasions: without aids and with aids (food models or reference food photographs). Overall, the use of a PSEA significantly reduced mean (±SD) group error compared with no aid (-2.5±11.5% vs. 19.0±28.8%; p<0.05). The type of PSEA (food models vs. reference food photographs) did not have a notable effect on group estimation error (-6.7±14.9% vs. 1.4±5.9%, respectively; p=0.321). This exploratory study provided evidence that the use of aids in general, rather than the type of aid, was more effective in reducing estimation error. The findings guided the development of the Dietary Estimation and Assessment Tool (DEAT) for use in the analysis of the Nutricam dietary record. Part B evaluated the effect of the DEAT on the error associated with the quantification of two 3-day Nutricam dietary records in a sample of 29 dietetic students (2 male; age = 23.3±5.1 years; BMI = 20.6±1.9 kg/m2). Subjects were randomised into two groups, Group A and Group B. For Record 1, use of the DEAT (Group A) resulted in a smaller error than estimations made without the tool (Group B) (17.7±15.8%/day vs. 34.0±22.6%/day, respectively; p=0.331). For Record 2, all subjects used the DEAT, with resultant error similar between Groups A and B (21.2±19.2%/day vs. 25.8±13.6%/day, respectively; p=0.377). In general, the moderate estimation error associated with quantifying food items did not translate into clinically significant differences in the nutrient profile of the Nutricam dietary records; only amorphous foods were notably over-estimated in energy content without the use of the DEAT (57 kJ/day vs. 274 kJ/day; p<0.001). A large proportion (89.6%) of the group found the DEAT helpful when quantifying food items contained in the Nutricam dietary records. Use of the DEAT reduced quantification error, minimising any potential effect on the estimation of energy and macronutrient intake.
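As a small illustration of the group estimation error reported in Study 3 above, the sketch below computes a signed percentage error per food item and averages it across items. The portion weights are hypothetical, and the exact formula used in the thesis is not stated in the abstract:

```python
import numpy as np

# Hypothetical portion-size estimates (grams) against known true weights for a
# handful of food items, comparing a "no aid" and a "with aid" condition.
true_weight  = np.array([150.0, 80.0, 200.0, 45.0])
est_no_aid   = np.array([190.0, 95.0, 230.0, 60.0])
est_with_aid = np.array([155.0, 78.0, 195.0, 47.0])

def pct_error(estimated, true):
    """Signed percentage error per item; its mean is a group estimation error."""
    return 100.0 * (estimated - true) / true

print("no aid  :", round(pct_error(est_no_aid, true_weight).mean(), 1), "%")
print("with aid:", round(pct_error(est_with_aid, true_weight).mean(), 1), "%")
```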
Study 4: Evaluation of the NuDAM
The accuracy and inter-rater reliability of the NuDAM in assessing energy and macronutrient intake were evaluated in a sample of 10 adults (6 male; age = 61.2±6.9 years; BMI = 31.0±4.5 kg/m2). Intake recorded using both the NuDAM and a weighed food record (WFR) was coded by three dietitians and compared with an objective measure of total energy expenditure (TEE) obtained using the doubly labelled water technique. At the group level, energy intake (EI) was under-reported to a similar extent by both methods, with EI:TEE ratios of 0.76±0.20 for the NuDAM and 0.76±0.17 for the WFR. At the individual level, four subjects reported implausible levels of energy intake using the WFR, compared with three using the NuDAM. Overall, moderate to high correlation coefficients (r=0.57-0.85) were found between the two dietary measures for energy and all macronutrients except fat (r=0.24). High agreement was observed between dietitians for estimates of energy and macronutrients derived from both the NuDAM (ICC=0.77-0.99; p<0.001) and the WFR (ICC=0.82-0.99; p<0.001). All subjects preferred using the NuDAM over the WFR to record intake and were willing to use the novel method again over longer recording periods.

This research program explored two novel approaches that used distinct technologies to aid the nutritional management of adults with T2DM. In particular, this thesis makes a significant contribution to the evidence base surrounding the use of PhRs through the development, trial and evaluation of a novel mobile phone photo/voice dietary record. The NuDAM is an extremely promising advancement in the nutritional management of individuals with diabetes and other chronic conditions. Future applications lie in integrating the NuDAM with other technologies to facilitate practice across the remaining stages of the nutrition care process.

Abstract:

While changes in work and employment practices in the mining sector have been profound, the literature addressing mining work is somewhat partial as it focuses primarily on the workplace as the key (or only) site of analysis, leaving the relationship between mining work and families and communities under-theorized. This article adopts a spatially oriented, case-study approach to the sudden closure of the Ravensthorpe nickel mine in the south-west of Western Australia to explore the interplay between the new scales and mobilities of labour and capital and work–family–community connections in mining. In the context of the dramatically reconfigured industrial arena of mining work, the study contributes to a theoretical engagement between employment relations and the spatial dimensions of family and community in resource-affected communities.

Abstract:

Introduction: In Singapore, motorcycle crashes account for 50% of traffic fatalities and 53% of injuries. While extensive research efforts have been devoted to improving motorcycle safety, the relationship between rider behavior and crash risk is still not well understood. The objective of this study is to evaluate how behavioral factors influence crash risk and to identify the most vulnerable group of motorcyclists. Methods: To explore rider behavior, a 61-item questionnaire examining sensation seeking (Zuckerman et al., 1978), impulsiveness (Eysenck et al., 1985), aggressiveness (Buss & Perry, 1992), and risk-taking behavior (Weber et al., 2002) was developed. A total of 240 respondents with at least one year of riding experience formed the sample, relating behavior to crash history, traffic penalty awareness, and demographic characteristics. Crash risk was clustered using a medoid partitioning algorithm, and a log-linear model relating rider behavior to crash risk was developed. Results and Discussion: Crash-involved motorcyclists scored higher in impulsive sensation seeking, aggression and risk-taking behavior. Aggressive and high risk-taking motorcyclists were respectively 1.30 and 2.21 times more likely to fall into the high crash-involvement group, while impulsive sensation seeking was not found to be significant. Based on their scores on risk-taking and aggression, the motorcyclists were clustered into four distinct personality combinations, namely extrovert (aggressive, impulsive risk-takers), leader (cautious, aggressive risk-takers), follower (agreeable, ignorant risk-takers), and introvert (self-conscious, fainthearted risk-takers). “Extrovert” motorcyclists were most prone to crashes, being 3.34 times more likely to be involved in a crash and 8.29 times more vulnerable than the “introvert” group. Mediating factors such as demographic characteristics, riding experience, and traffic penalty awareness were not found to be significant in reducing crash risk. Conclusion: The findings of this study will be useful for road safety campaign planners in focusing on the target group, as well as for those who employ motorcyclists for their delivery business.
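The abstract names a medoid partitioning algorithm for clustering respondents. As a rough illustration only, the sketch below implements a simplified Voronoi-iteration style of medoid partitioning in Python on hypothetical standardised risk-taking and aggression scores; it is not the specific algorithm, software, or data used in the study:

```python
import numpy as np

def k_medoids(X, k, n_iter=100, seed=0):
    """Simplified medoid partitioning (Voronoi iteration) on a feature matrix X."""
    rng = np.random.default_rng(seed)
    n = len(X)
    # Pairwise Euclidean distances between respondents.
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    medoids = rng.choice(n, size=k, replace=False)
    for _ in range(n_iter):
        labels = np.argmin(dist[:, medoids], axis=1)
        new_medoids = medoids.copy()
        for j in range(k):
            members = np.where(labels == j)[0]
            if members.size == 0:
                continue
            # The new medoid minimises total distance to the other cluster members.
            within = dist[np.ix_(members, members)].sum(axis=1)
            new_medoids[j] = members[np.argmin(within)]
        if np.array_equal(new_medoids, medoids):
            break
        medoids = new_medoids
    return medoids, np.argmin(dist[:, medoids], axis=1)

# Hypothetical standardised scores per respondent: [risk-taking, aggression].
rng = np.random.default_rng(1)
scores = rng.normal(size=(240, 2))
medoids, labels = k_medoids(scores, k=4)   # four personality clusters
print("cluster sizes:", np.bincount(labels))
```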

Abstract:

Prior in vitro studies, utilizing 31P nuclear magnetic resonance (31P NMR) to measure the chemical shift of β-ATP and the lengthening of the phosphocreatine spin-spin (T2) relaxation time, suggested an assessment of their efficacy in measuring magnesium depletion in vivo. Dietary magnesium depletion (Mg2+↓) produced markedly lower magnesium in plasma (0.44 vs 1.13 mmol/liter) and bone (130 vs 190 μmol/g) but much smaller changes in muscle (41 vs 45 μmol/g, P < 0.01), heart (42.5 vs 44.6 μmol/g), and brain (30 vs 32 μmol/g). NMR experiments in anesthetized rats in a Bruker 7-T vertical bore magnet showed that in Mg2+↓ rats there was a significant change in the brain β-ATP shift (16.15 vs 16.03 ppm, P < 0.05). These chemical shifts gave a calculated free [Mg2+] of 0.71 mM (control) and 0.48 mM (Mg2+↓). In muscle the change in β-ATP shift was not significant (Mg2+↓ 15.99 ppm, controls 15.96 ppm), corresponding to a calculated free [Mg2+] of 0.83 and 0.95 mM, respectively. Phosphocreatine T2 (Carr-Purcell spin-echo pulse sequence) was no different with Mg2+↓ in muscle in vivo (surface coil) (Mg2+↓ 136, control 142 ms) or in isolated perfused hearts (Helmholtz coil) (control 83, Mg2+↓ 92 ms). 31P NMR is severely limited in its ability to detect dietary magnesium depletion in vivo. Measurement of the β-ATP shift in brain may allow studies of the effects of interaction in group studies but does not allow prediction of an individual's magnesium status.
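The free [Mg2+] values above are calculated from the β-ATP chemical shift. The abstract does not give the expression or constants used, but a standard (Gupta-style) relation between the observed shift and cytosolic free magnesium takes the following form, shown here only as a sketch:

```latex
% phi is the fraction of ATP complexed with Mg2+, inferred from the observed
% beta-ATP shift (delta_obs) relative to the shifts of free ATP (delta_free)
% and fully Mg-complexed ATP (delta_bound); K_D is the MgATP dissociation
% constant. The specific constants used in the study are not stated here.
\[
\phi_{\mathrm{MgATP}} = \frac{\delta_{\mathrm{free}} - \delta_{\mathrm{obs}}}
                             {\delta_{\mathrm{free}} - \delta_{\mathrm{bound}}},
\qquad
[\mathrm{Mg}^{2+}]_{\mathrm{free}} = K_{D}\,
\frac{\phi_{\mathrm{MgATP}}}{1-\phi_{\mathrm{MgATP}}}
\]
```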

Abstract:

A multiple reaction monitoring mass spectrometric assay for the quantification of PYY in human plasma has been developed. A two-stage sample preparation protocol was employed in which plasma containing the full-length neuropeptide was first digested using trypsin, followed by solid-phase extraction to isolate the digested peptide from the complex plasma matrix. The peptide extracts were analysed by LC-MS using multiple reaction monitoring to detect and quantify PYY. The method has been validated for plasma samples, yielding linear responses over the range 5–1,000 ng mL⁻¹. The method is rapid, robust and specific for plasma PYY detection.
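As a small sketch of how a validated linear response over 5–1,000 ng mL⁻¹ is typically applied, the Python snippet below fits an ordinary least-squares calibration line to hypothetical peak-area ratios and back-calculates an unknown plasma concentration. All values are invented, and in practice MRM assays often prefer weighted regression (e.g. 1/x² weighting) rather than the unweighted fit shown here:

```python
import numpy as np

# Hypothetical MRM calibration data: spiked PYY standards (ng/mL) and the
# corresponding peak-area ratios (analyte / internal standard).
conc = np.array([5, 25, 100, 250, 500, 1000], dtype=float)
ratio = np.array([0.012, 0.061, 0.244, 0.610, 1.22, 2.45])

# Ordinary least-squares calibration line: ratio = slope * conc + intercept.
slope, intercept = np.polyfit(conc, ratio, deg=1)

def back_calculate(peak_ratio):
    """Convert a measured peak-area ratio into a PYY concentration (ng/mL)."""
    return (peak_ratio - intercept) / slope

sample_ratio = 0.35   # hypothetical unknown plasma sample
print(f"Estimated PYY: {back_calculate(sample_ratio):.1f} ng/mL")
```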

Abstract:

The design-build (DB) system is a popular and effective delivery method for construction projects worldwide. After owners decide to procure their projects through the DB system, they may wish to determine the optimal proportion of design to be provided in the DB request for proposals (RFPs), which serve as solicitations for design-builders and describe the scope of work. However, this presents difficulties for DB owners, and there is little, if any, systematic research in this area. This paper reports on an empirical study in the USA entailing both an online questionnaire survey and a Delphi survey to identify and evaluate the factors influencing owners’ decisions in determining the proportion of design to include in DB RFPs. Eleven factors are identified: (1) clarity of project scope; (2) applicability of performance specifications; (3) desire for design innovation; (4) site constraints; (5) availability of competent design-builders; (6) project control requirements; (7) user group involvement level; (8) third party requirements; (9) owner experience with DB; (10) project complexity; and (11) schedule constraints. A statistically significant agreement on the eleven factors was also obtained from the (mainly non-owner) Delphi experts. Although some of the experts hold different opinions on how these factors affect the proportion of design, the findings furnish various stakeholders with a better understanding of the delivery process of DB projects and the appropriate provision of project information in DB RFPs. As the results mainly reflect industry opinion on the optimal proportion of design, future studies should seek owners’ input to obtain a more complete picture.

Abstract:

A composite SaaS (Software as a Service) is software composed of several software components and data components. The composite SaaS placement problem is to determine where each of the components should be deployed in a cloud computing environment such that the performance of the composite SaaS is optimal. From the computational point of view, the composite SaaS placement problem is a large-scale combinatorial optimization problem; thus, an Iterative Cooperative Co-evolutionary Genetic Algorithm (ICCGA) was proposed. The ICCGA can find solutions of reasonable quality; however, its computation time is noticeably long. Aiming to improve the computation time, this paper proposes an unsynchronized Parallel Cooperative Co-evolutionary Genetic Algorithm (PCCGA). Experimental results show that the PCCGA not only has a quicker computation time but also generates better-quality solutions than the ICCGA.
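To make the cooperative co-evolutionary structure concrete, the following is a minimal, single-process Python sketch of two species (software-component and data-component placements) that evolve against each other's current best member on a toy cost function. It is only an illustration of the general scheme: the authors' ICCGA and the unsynchronized PCCGA differ in their operators, in how collaborators are exchanged, and in running the species in parallel rather than in round-robin fashion as here:

```python
import random

# Toy placement problem: S software components and D data components are each
# assigned to one of M servers; the cost is a made-up sum of server-to-server
# communication costs. Every name and number here is hypothetical.
S, D, M = 6, 4, 3
random.seed(0)
COST = [[random.random() for _ in range(M)] for _ in range(M)]

def cost(sw, data):
    """Total communication cost between software placements and data placements."""
    return sum(COST[sw[i]][data[j]] for i in range(S) for j in range(D))

def evolve(pop, partner_best, fitness, n_gen=50):
    """Evolve one species against the other species' best member (lower is better)."""
    for _ in range(n_gen):
        elite = sorted(pop, key=lambda ind: fitness(ind, partner_best))[: len(pop) // 2]
        children = []
        for parent in elite:
            child = parent[:]
            child[random.randrange(len(child))] = random.randrange(M)  # point mutation
            children.append(child)
        pop = elite + children
    best = min(pop, key=lambda ind: fitness(ind, partner_best))
    return best, pop

sw_pop = [[random.randrange(M) for _ in range(S)] for _ in range(20)]
dt_pop = [[random.randrange(M) for _ in range(D)] for _ in range(20)]
best_sw, best_dt = sw_pop[0], dt_pop[0]

# Round-robin cooperative co-evolution: each species is evaluated using the
# other's current best placement as its collaborator.
for _ in range(5):
    best_sw, sw_pop = evolve(sw_pop, best_dt, lambda ind, partner: cost(ind, partner))
    best_dt, dt_pop = evolve(dt_pop, best_sw, lambda ind, partner: cost(partner, ind))

print("best combined placement cost:", round(cost(best_sw, best_dt), 3))
```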