887 results for Illinois Senior Volunteer Service Credit Program.
Abstract:
The purpose of this study is to descriptively analyze the current program at the Ben Taub Pediatric Weight Management Program in Houston, Texas, a program designed to help overweight children ages three to eighteen lose weight. In Texas, approximately one in every three children is overweight or obese. Obesity is seen at an even greater level within Ben Taub due to the hospital's high rate of service for underserved minority populations (Dehghan et al., 2005; Tyler and Horner, 2008; Hunt, 2009). The weight management program consists of nutritional, behavioral, physical activity, and medical counseling. Analysis will focus on changes in weight, BMI, cholesterol levels, and blood pressure from 2007–2010 for all participants who attended at least two weight management sessions. Recommendations will be given in response to the results of the data analysis.
Abstract:
Background. This study was designed to evaluate the effects of the Young Leaders for Healthy Change program, an internet-delivered program in the school setting that emphasized health advocacy skills development, on nutrition and physical activity behaviors among older adolescents (13–18 years). The program consisted of online curricular modules, training modules, social media, peer and parental support, and a community service project. Module content was developed based on Social Cognitive Theory and known determinants of behavior for older adolescents. Methods. Of the 283 students who participated in the fall 2011 YL program, 38 students participated in at least ten of the 12 weeks and were eligible for this study. This study used a single-group pretest/posttest evaluation design. Participants were 68% female, 58% white/Caucasian, 74% 10th or 11th graders, and 89% mostly A and/or B students. The primary behavioral outcomes for this analysis were participation in 60 minutes of physical activity per day, participation in 20 minutes of moderate- or vigorous-intensity physical activity (MVPA) per day, television and computer time, fruit and vegetable (FV) intake, sugar-sweetened beverage intake, and consumption of breakfast, home-cooked meals, and fast food. Other outcomes included knowledge, beliefs, and attitudes related to healthy eating, physical activity, and advocacy skills. Findings. Among the 38 participants, no significant changes in any variables were observed. However, among those who did not previously meet behavioral goals, there was an 89% increase in students who participated in more than 20 minutes of MVPA per day and a 58% increase in students who ate home-cooked meals 5–7 days per week. The majority of participants met program goals related to knowledge, beliefs, and attitudes prior to the start of the program.
Participants reported either maintaining or improving to the goal at posttest for all items except FV intake knowledge, taste and affordability of healthy foods, interest in teaching others about being healthy, and ease of finding ways to advocate in the community. Conclusions. The results of this evaluation indicated that promoting healthy behaviors requires different strategies than maintaining healthy behaviors among high school students. In the school setting, programs need to target both the promotion and the maintenance of health behaviors to engage all students who participate in the program as part of a class or club activity. Tailoring the program, using screening and modifying strategies to meet the needs of all students, may increase its potential reach. The Transtheoretical Model (TTM) may provide information on how to develop a tailored program, and additional research is needed on how to utilize TTM constructs effectively among high school students. Further evaluation studies should employ a more expansive design to assess the long-term effectiveness of health advocacy programming.
Abstract:
While most professionals do not dispute that evaluation is necessary to determine whether agencies and practitioners are truly providing services that meet clients' needs, information regarding consistent measures of service effectiveness in human service organizations is sparse. A national survey of 250 not-for-profit family service organizations in the United States (52.8% return rate) yielded results relevant to client-identified needs and agency effectiveness measures in serving today's families. On an open-ended survey item, 52.3% of agencies indicated that poverty represented the most pressing problem among today's families, taking priority over other psychological needs. Over two thirds of these agencies used multiple methods to evaluate their services, with client feedback and outcome measures the most popular. The findings reveal agencies' difficulties in determining what or who decides whether the most appropriate services are being provided for the target population. Limited data collected on outcomes and impact may impose additional difficulties in program design and planning.
Abstract:
Parent partner mentoring programs are an innovative strategy for child welfare agencies to engage families in case planning and service delivery. These programs recruit and train parents who have been involved in the system and have successfully resolved identified child abuse or neglect issues to work with families with current open cases in the child welfare system. Parent partner mentors can provide social and emotional support, advocacy, and practical advice for navigating this challenging system. Insofar as parent partners share the experiences and the cultural and socioeconomic characteristics of the families they serve, they may be more successful in engaging families and building trusting, supportive relationships. The current study presents qualitative data from interviews and case studies of families who were matched with a parent partner in a large county in a Midwestern state. Interviews with families, parent partner mentors, child welfare agency staff, and community partners and providers suggest that parent partner programs may be just as beneficial for parent partner mentors as they are for the families being mentored. These programs can build professional skills, help improve self-esteem, provide an avenue for social support, and may potentially prevent recidivism. Parent partner programs also provide a mechanism for amplifying family voice at all levels of the agency.
Abstract:
The purpose of this study was to evaluate the effectiveness of an HIV-screening program at a private health-care institution where the providers were trained to counsel pregnant women about the HIV-antibody test according to the latest recommendations made by the U.S. Public Health Service (PHS) and the Texas legislature. A before-and-after study design was selected for the study. The participants were OB/GYN nurses who attended an educational program and the patients they counseled about the HIV test. Training improved the nurses' overall knowledge about the content of the program and nurses were more likely to offer the HIV test to all pregnant women regardless of their risk of infection. Still, contrary to what was predicted, the nurses did not give more information to increase the knowledge pregnant women had about HIV infection, transmission, and available treatments. Consequently, many women were not given the chance to correctly assess their risk during the counseling session and there was no evidence that knowledge would reduce the propensity of many women to deny being at risk for HIV. On the other hand, pregnant women who received prenatal care after the implementation of the HIV-screening program were more likely to be tested than women who received prenatal care before its implementation (96% vs. 48%); in turn, the likelihood that more high-risk women would be tested for HIV also increased (94% vs. 60%). There was no evidence that mandatory testing with right of refusal would deter women from being tested for HIV. When the moment comes for a woman to make her decision, other concerns are more important to her than whether the option to be tested is mandatory or not. The majority of pregnant women indicated that their main reasons for being tested were: (a) the recommendation of their health-care provider; and (b) concern about the risks to their babies. 
Recommending that all pregnant women be tested regardless of their risk of infection, together with making the HIV test readily available to all women, are probably the two best ways of increasing patients' participation in an HIV-screening program for pregnant women.
Abstract:
Improving energy efficiency is unarguably an urgent issue in developing economies, and an energy efficiency standard and labeling program is an ideal mechanism to achieve this target. However, there is concern about whether consumers will choose highly energy-efficient appliances, given their higher prices resulting from higher production costs. This paper estimates how consumers responded to the introduction of the energy efficiency standard and labeling program in China. To quantify consumers' valuations, we estimated consumer surplus and product benefits based on the estimated parameters of the demand function. We found the following. First, consumers' valuation of the energy efficiency labels is not monotonically correlated with the label grades: the highest-efficiency label (Label 1) is not always valued above Labels 2 and 3, and is sometimes valued below the least energy-efficient label (Label UI). This runs counter to the design of the policy intervention. Second, several governmental policies affect welfare in mixed directions: subsidies for energy-saving appliances carrying the highest label grade expand consumer welfare, as the program was designed to do, whereas the appliance-replacement policies decreased welfare.
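The welfare calculation described above can be illustrated with a toy sketch under a standard multinomial logit demand model. All parameter values below are hypothetical (they are not the paper's estimates), and the labels and the subsidy size are chosen purely for illustration:

```python
import math

# Hypothetical estimated parameters -- illustrative values only.
alpha = 0.8  # price sensitivity (marginal utility of income)
# Mean utility of each label grade: V_j = quality_j - alpha * price_j
products = {
    "Label 1": {"quality": 3.0, "price": 3.2},  # most efficient, most expensive
    "Label 2": {"quality": 2.4, "price": 2.1},
    "Label 5": {"quality": 1.0, "price": 1.0},  # least efficient, cheapest
}

def consumer_surplus(products, alpha):
    """Expected consumer surplus per purchase under multinomial logit:
    CS = (1/alpha) * ln(1 + sum_j exp(V_j)), with an outside option."""
    inclusive = sum(math.exp(p["quality"] - alpha * p["price"])
                    for p in products.values())
    return math.log(1.0 + inclusive) / alpha

base = consumer_surplus(products, alpha)
# A subsidy on the top-grade label lowers its effective price; if the
# program works as designed, consumer surplus should rise.
products["Label 1"]["price"] -= 0.5
subsidized = consumer_surplus(products, alpha)
assert subsidized > base
```

The sketch shows only the mechanism the abstract relies on: once demand parameters are estimated, welfare effects of a subsidy follow directly from the change in the inclusive value.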
Abstract:
Compile-time program analysis techniques can be applied to Web service orchestrations to prove or check various properties. In particular, service orchestrations can be subjected to resource analysis, in which safe approximations of upper and lower resource usage bounds are deduced. A uniform analysis can be performed simultaneously for different generalized resources that can be directly correlated with cost- and performance-related quality attributes, such as invocations of partners, network traffic, number of activities, iterations, and data accesses. The resulting safe upper and lower bounds do not depend on probabilistic assumptions, and are expressed as functions of the size or length of data components from an initiating message, using a fine-grained structured data model that corresponds to the XML style of information structuring. The analysis is performed by transforming a BPEL-like representation of an orchestration into an equivalent program in another programming language for which the appropriate analysis tools already exist.
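The bound-inference idea can be made concrete with a minimal sketch. The orchestration below is hypothetical (it is not the paper's analysis tool or benchmark): it iterates over the items of an input message, making one mandatory partner invocation per item plus one conditional invocation, so a safe analysis would report the invocation count bounded between n and 2n for a message of size n:

```python
# Hypothetical BPEL-like orchestration: one mandatory partner call per
# item, plus at most one conditional (e.g. logging) call per iteration.

def invocation_bounds(n: int) -> tuple[int, int]:
    """Safe lower/upper bounds on partner invocations, as functions of
    the input message size n: lb(n) = n, ub(n) = 2 * n."""
    return n, 2 * n

def run_orchestration(items, log_predicate):
    """Simulate the orchestration and count actual partner invocations."""
    calls = 0
    for item in items:
        calls += 1            # mandatory partner invocation
        if log_predicate(item):
            calls += 1        # conditional partner invocation
    return calls

items = [1, 2, 3, 4]
lb, ub = invocation_bounds(len(items))
calls = run_orchestration(items, lambda x: x % 2 == 0)
assert lb <= calls <= ub  # any execution stays within the safe bounds
```

The point of the sketch is that the bounds are symbolic functions of the input size, not probabilistic estimates: every concrete run, whatever the conditional branch does, falls inside them.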
Abstract:
The focus of this paper is to outline the main structure of an alternative solution for implementing a Software Process Improvement program in Small-Settings using an outsourcing infrastructure. This solution takes advantage of traditional outsourcing models and applies their structure to propose an alternative way of making a Software Process Improvement program available to Small-Settings. With this outsourcing solution, it is possible to share resources among several Small-Settings.
Abstract:
Telecommunications networks have always been expanding, and thanks to this, new services have appeared. The old mechanisms for carrying packets have become obsolete in the face of new service requirements: services now operate in real time, and real-time traffic requires strict service guarantees. When this traffic is sent through the network, enough resources must be allocated to avoid delays and information loss. When browsing the Internet and requesting web pages, data must be sent from a server to the user; if a packet is dropped during transmission, it is simply retransmitted, and for the end user it hardly matters whether the page takes one or two seconds longer to load. But if the user is holding a conversation with a VoIP program such as Skype, one or two seconds of delay may be catastrophic, and neither party can understand the other. To support these new services, networks have to evolve, and for this purpose MPLS and QoS were developed. MPLS is a packet-carrying mechanism used in high-performance telecommunication networks that directs and carries data along pre-established paths. Packets are forwarded on the basis of labels, which is faster than routing them by IP address. MPLS also supports Traffic Engineering (TE): the process of selecting the best paths for data traffic in order to balance the traffic load across the different links. In a network with multiple paths, routing algorithms calculate the shortest one, and most of the time all traffic is directed through it, causing overload and packet drops while the other available paths carry no traffic. But this alone is not enough to give real-time traffic the guarantees it needs. These mechanisms improve the network, but they do not change how the traffic itself is treated. That is why Quality of Service (QoS) was developed.
Quality of Service is the ability to give different priority to different applications, users, or data flows, or to guarantee a certain level of performance to a data flow. Traffic is divided into classes, and each class is treated differently according to its Service Level Agreement (SLA). Traffic with the highest priority has preference over lower classes, but this does not mean it monopolizes all the resources. To achieve this, a set of policies is defined to control and shape how the traffic flows; the possibilities are endless, depending on how the network must be structured. Using these mechanisms, it is possible to provide the necessary guarantees to real-time traffic, distributing it among classes inside the network and offering the best possible service for both real-time and non-real-time data.
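The class-based treatment described above can be sketched with a toy strict-priority scheduler. This is not any specific router implementation; the class names and priority values are illustrative assumptions:

```python
import heapq
from itertools import count

# Toy QoS sketch: real-time (VoIP) packets are always dequeued before
# best-effort (web) packets; within a class, packets stay FIFO.
PRIORITY = {"voip": 0, "web": 1}  # lower value = served first

class PriorityScheduler:
    def __init__(self):
        self._heap = []
        self._seq = count()  # arrival counter for FIFO tie-breaking

    def enqueue(self, traffic_class, packet):
        heapq.heappush(self._heap,
                       (PRIORITY[traffic_class], next(self._seq), packet))

    def dequeue(self):
        return heapq.heappop(self._heap)[2]

sched = PriorityScheduler()
sched.enqueue("web", "http-1")
sched.enqueue("voip", "rtp-1")
sched.enqueue("web", "http-2")
sched.enqueue("voip", "rtp-2")
order = [sched.dequeue() for _ in range(4)]
# VoIP traffic jumps ahead of web traffic, as a QoS policy would arrange
assert order == ["rtp-1", "rtp-2", "http-1", "http-2"]
```

Real deployments add safeguards (rate limits, weighted fair queuing) so that, as the abstract notes, the highest class does not monopolize all resources; strict priority is only the simplest building block.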
Abstract:
Service-Oriented Computing (SOC) is a widely accepted paradigm for the development of flexible, distributed, and adaptable software systems, in which service compositions perform more complex, higher-level, often cross-organizational tasks using atomic services or other service compositions. In such systems, Quality of Service (QoS) properties, such as performance, cost, availability, or security, are critical for the usability of services and their compositions in concrete applications. Analysis of these properties can become more precise and information-rich if it employs program analysis techniques, such as complexity and sharing analyses, which are able to take into account simultaneously the control and data structures, dependencies, and operations in a composition. Computation cost analysis for service compositions can support predictive monitoring and proactive adaptation by automatically inferring upper and lower bounds on computation cost as functions of the value or size of input messages. These cost functions can be used for adaptation by selecting the service candidates that minimize the total cost of the composition, based on the actual data passed to them. The cost functions can also be combined with empirically collected infrastructure parameters to produce QoS bound functions of the input data, which can be used to predict potential or imminent Service Level Agreement (SLA) violations at the moment of invocation. In mission-critical applications, effective and accurate continuous QoS prediction can be achieved by constraint modeling of composition QoS based on its structure, data known at runtime, and (when available) the results of complexity analysis. This approach can be applied to service orchestrations with centralized flow control, as well as to choreographies with multiple participants engaged in complex stateful interactions. Sharing analysis can support adaptation actions, such as parallelization, fragmentation, and component selection, which are based on functional dependencies and the information content of the composition's messages, internal data, and activities, in the presence of complex control constructs such as loops, branches, and sub-workflows. Both the functional dependencies and the information content (described using user-defined attributes) can be expressed using a first-order logic (Horn clause) representation, and the analysis results can be interpreted as lattice-based conceptual models.
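The adaptation-by-selection step described above can be illustrated with a minimal sketch. The two candidate services and their cost bound functions are hypothetical (they are not taken from the thesis); the point is only that, once upper cost bounds are known as functions of input size, the cheapest candidate can be chosen per invocation from the actual data:

```python
# Hypothetical inferred cost upper bounds, as functions of input size n.
candidates = {
    "serviceA": lambda n: 5 * n + 20,   # cheap per item, high fixed cost
    "serviceB": lambda n: 12 * n + 1,   # expensive per item, low fixed cost
}

def select_candidate(n: int) -> str:
    """Pick the candidate whose cost upper bound is lowest for the
    actual input size n -- the composition-adaptation selection step."""
    return min(candidates, key=lambda name: candidates[name](n))

# Small messages favor the low-fixed-cost service; large ones flip it.
assert select_candidate(1) == "serviceB"    # bounds: 25 vs 13
assert select_candidate(10) == "serviceA"   # bounds: 70 vs 121
```

Because the bounds are functions of the input rather than constants, the choice can change from call to call; combining them with measured infrastructure parameters would give the SLA-violation predictions the abstract describes.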
Abstract:
This study examines the concept of engagement in samples of volunteers from different non-profit organisations. Study 1 analyzes the psychometric properties of the abbreviated version of the Utrecht Work Engagement Scale (UWES) (Schaufeli, Bakker, & Salanova, 2006a), examining two factorial structures: one-dimensional and three-dimensional. Based on the Three-Stage Model of Volunteers' Duration of Service (Chacón, Vecina, & Dávila, 2007), Study 2 investigates the relationship between engagement, volunteer satisfaction, and intention to remain in a sample of new volunteers, and the relationship between engagement, organisational commitment, and intention to remain in a sample of veteran volunteers. A moderated mediation analysis uses duration of service as a moderator in order to set a splitting point between new and veteran volunteers. The results of the confirmatory factor analysis suggest that the three-factor model fits the data better. Regarding the structural models, the first shows that engagement is crucial to volunteer satisfaction during the first stage, while volunteer satisfaction is the key variable in explaining intention to continue. The second structural model shows that engagement reinforces the participant's commitment to the organisation, while organisational commitment predicts intention to continue. Both models show a notable decline in fit when the samples are exchanged.
Abstract:
Although initially conceived as providing simply the preventive portion of an extended continuum of care for veterans, the Driving Under the Influence (DUI) program has turned out to be an important outreach service for active duty or recently discharged OEF/OIF (Operation Enduring Freedom/Operation Iraqi Freedom) veterans. Veterans receive empirically based, state-mandated education and therapy under the only Department of Veterans Affairs (VA)-sponsored DUI program in the State of Colorado, with the advantage of having providers who are sensitive to symptoms of Post-Traumatic Stress Disorder (PTSD) and other relevant diagnoses specific to this population, including Traumatic Brain Injury (TBI). In this paper, the rapid growth of this program is described, along with summary data regarding the completion, discontinuation, and augmentation of services from the original referral concern. Key results indicated that for nearly one third (31.9%) of the OEF/OIF veterans enrolled in the DUI program, this was their initial contact with the VA health care system. Furthermore, following their enrollment in the DUI program, more than one fourth (27.6%) were later referred to and attended other VA programs, including PTSD rehabilitation and group therapy, anger management, and intensive inpatient or outpatient dual diagnosis programs. These and other findings suggest that the DUI program may be an effective additional pathway for providing treatment that is particularly salient to the distinctive OEF/OIF population, and one that may also result in earlier intervention for problem drinking and other combat-related problems. The conclusions discussed herein primarily aim to improve providers' understanding of effective outreach and to enhance the appropriate linkages between OEF/OIF veterans and existing VA services.