898 results for Election Counting and Reporting Software


Relevance:

100.00%

Publisher:

Abstract:

Mobile phones have the potential to foster political mobilisation. There is significant political power in mobile technology. Like the Internet, mobile phones facilitate communication and rapid access to information. Compared to the Internet, however, mobile phone diffusion has reached a larger proportion of the population in most countries, so the impact of this new medium is conceivably greater. There are now more mobile phones in the UK than there are people (an average of 121 mobile phones for every 100 people). In this paper, an attempt to use modern mobile technology to support the General Election is discussed. Pre-election advertising, election-day issues (including election news and results as they come in), and answering questions via text message about the results of current and previous general elections are considered.
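
As a rough illustration of the kind of text-message query service described above, the sketch below shows one way such a lookup could work. It is hypothetical throughout: the paper does not specify its message format, and the constituency names and result strings are invented.

```python
# Hypothetical sketch of answering a text-message query about election
# results. Message format, constituencies, and results are all invented.

RESULTS = {
    ("aston", 2005): "Labour hold, majority 7,142",  # illustrative values only
    ("aston", 2001): "Labour hold, majority 6,465",
}

def answer_sms(message: str) -> str:
    """Parse a query like 'RESULT Aston 2005' and return a one-line reply."""
    parts = message.strip().split()
    if len(parts) != 3 or parts[0].upper() != "RESULT":
        return "Format: RESULT <constituency> <year>"
    constituency, year = parts[1].lower(), parts[2]
    if not year.isdigit():
        return "Year must be a number, e.g. RESULT Aston 2005"
    result = RESULTS.get((constituency, int(year)))
    return result or f"No result on file for {parts[1]} {year}"

print(answer_sms("RESULT Aston 2005"))
```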

Relevance:

100.00%

Publisher:

Abstract:

Objectives: To develop a decision support system (DSS), myGRaCE, that integrates service user (SU) and practitioner expertise about mental health and associated risks of suicide, self-harm, harm to others, self-neglect, and vulnerability. The intention is to help SUs assess and manage their own mental health collaboratively with practitioners. Methods: An iterative process involving interviews, focus groups, and agile software development with 115 SUs, to elicit and implement myGRaCE requirements. Results: Findings highlight shared understanding of mental health risk between SUs and practitioners that can be integrated within a single model. However, important differences were revealed in SUs' preferred process of assessing risks and safety, which are reflected in the distinctive interface, navigation, tool functionality and language developed for myGRaCE. A challenge was how to provide flexible access without overwhelming and confusing users. Conclusion: The methods show that practitioner expertise can be reformulated in a format that simultaneously captures SU expertise, to provide a tool highly valued by SUs. A stepped process adds necessary structure to the assessment, each step with its own feedback and guidance. Practice Implications: The GRiST web-based DSS (www.egrist.org) links and integrates myGRaCE self-assessments with GRiST practitioner assessments for supporting collaborative and self-managed healthcare.
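
A minimal sketch of the stepped assessment idea described above, assuming a simple structure in which each step carries its own questions, feedback and guidance; the step names, questions, scoring and wording are invented stand-ins, not taken from myGRaCE itself.

```python
# Toy model of a stepped self-assessment: each step collects answers and
# returns its own feedback before the user moves on. All content invented.
from dataclasses import dataclass, field

@dataclass
class Step:
    name: str
    questions: list[str]
    answers: dict[str, int] = field(default_factory=dict)

    def feedback(self) -> str:
        if len(self.answers) < len(self.questions):
            remaining = len(self.questions) - len(self.answers)
            return f"{self.name}: {remaining} question(s) remaining."
        score = sum(self.answers.values()) / len(self.answers)
        return f"{self.name}: average rating {score:.1f}/10 - consider reviewing this with your practitioner."

steps = [
    Step("Mood", ["How low has your mood been this week? (0-10)"]),
    Step("Safety", ["How safe have you felt this week? (0-10)"]),
]

for step in steps:               # steps are presented one at a time
    for q in step.questions:
        step.answers[q] = 5      # stand-in for user input
    print(step.feedback())
```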

Relevance:

100.00%

Publisher:

Abstract:

Computer software plays an important role in business, government, society and the sciences. To solve real-world problems, it is very important to measure quality and reliability throughout the software development life cycle (SDLC). Software Engineering (SE) is the computing field concerned with designing, developing, implementing, maintaining and modifying software. The present paper gives an overview of the Data Mining (DM) techniques that can be applied to various types of SE data in order to solve the challenges posed by SE tasks such as programming, bug detection, debugging and maintenance. A specific piece of DM software is discussed, namely an analytical tool for analyzing data and summarizing the relationships that have been identified. The paper concludes that the proposed DM techniques within the domain of SE could be applied equally well in fields such as Customer Relationship Management (CRM), eCommerce and eGovernment. ACM Computing Classification System (1998): H.2.8.
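
To make one of the surveyed techniques concrete, here is a minimal, hypothetical sketch of applying DM to SE data: a text classifier labeling issue-tracker entries. The tiny training set and labels are invented; a real study would mine a project's issue repository.

```python
# Toy DM-on-SE-data example: classify issue reports by category.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

reports = [
    "app crashes on startup with null pointer exception",
    "add dark mode to the settings screen",
    "memory leak when parsing large files",
    "please document the export API",
]
labels = ["bug", "feature", "bug", "docs"]

model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(reports, labels)

print(model.predict(["segfault when saving a file"]))  # -> ['bug'] on this toy data
```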

Relevance:

100.00%

Publisher:

Abstract:

Software development is an extremely complex process, during which human errors are introduced and result in faulty software systems. It is highly desirable and important that these errors be prevented and detected as early as possible. Software architecture design is a high-level system description, which embodies many system features and properties that are eventually implemented in the final operational system. Therefore, methods for modeling and analyzing software architecture descriptions can help prevent and reveal human errors and thus improve software quality. Furthermore, if an analyzed software architecture description can be used to derive a partial software implementation, especially when the derivation can be automated, significant benefits can be gained in both system quality and productivity. This dissertation proposes a framework for an integrated analysis of both the design and the implementation. To ensure the desirable properties of the architecture model, we apply formal verification using the model checking technique. To ensure the desirable properties of the implementation, we develop a methodology and an associated tool to translate an architecture specification into an implementation written in a combination of the ArchJava/Java/AspectJ programming languages. The translation is semi-automatic, so many manual programming errors can be prevented. Furthermore, the translation inserts monitoring code into the implementation so that runtime verification can be performed, which provides additional assurance of the implementation's quality. Moreover, validations of the translations from architecture model to program are provided. Finally, several case studies are conducted and presented.
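
The dissertation's monitoring code is inserted into ArchJava/Java/AspectJ output; as a language-neutral sketch of the same runtime-verification idea, the decorator below checks an invented property (a non-negative return value) on every call. It illustrates the technique only, not the dissertation's tool.

```python
# Sketch: wrap an operation so each call is verified against a property
# at runtime, analogous to monitoring code woven into an implementation.
import functools

def monitor(check, description):
    def decorate(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            result = fn(*args, **kwargs)
            if not check(result):
                raise AssertionError(f"runtime verification failed: {description}")
            return result
        return wrapper
    return decorate

@monitor(lambda r: r >= 0, "balance must stay non-negative")
def withdraw(balance, amount):
    return balance - amount

print(withdraw(100, 30))  # 70: passes the runtime check
# withdraw(100, 300)      # would raise AssertionError
```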

Relevance:

100.00%

Publisher:

Abstract:

The Internet has become an integral part of our nation’s critical socio-economic infrastructure. With its heightened use and growing complexity, however, organizations are at greater risk of cyber crimes. To aid in the investigation of crimes committed on or via the Internet, a network forensics analysis tool pulls together the needed digital evidence. It provides a platform for performing deep network analysis by capturing, recording and analyzing network events to find the source of a security attack or other information security incidents. Existing network forensics work has mostly focused on the Internet and fixed networks. But the exponential growth and use of wireless technologies, coupled with their unprecedented characteristics, necessitates the development of new network forensic analysis tools. This dissertation fostered the emergence of a new research field in cellular and ad-hoc network forensics. It was one of the first works to identify this problem and offer fundamental techniques and tools that laid the groundwork for future research. In particular, it introduced novel methods to record network incidents and report logged incidents. For recording incidents, location is considered essential to documenting network incidents. However, in network topology spaces, location cannot be measured due to the absence of a ‘distance metric’. Therefore, a novel solution was proposed to label the locations of nodes within network topology spaces, and then to authenticate the identity of nodes in ad hoc environments. For reporting logged incidents, a novel technique based on Distributed Hash Tables (DHT) was adopted. Although the direct use of DHTs for reporting logged incidents would result in uncontrollably recursive traffic, a new mechanism was introduced that overcomes this recursion. These logging and reporting techniques aided forensics over cellular and ad-hoc networks, which in turn increased the ability to track and trace attacks to their source. These techniques were a starting point for further research and development that would result in equipping future ad hoc networks with forensic components to complement existing security mechanisms.
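
As a schematic illustration of DHT-style incident reporting with bounded forwarding, here is a toy sketch; the ring layout, key scheme and hop budget are invented stand-ins for the dissertation's actual anti-recursion mechanism.

```python
# Toy DHT: hash an incident key onto a ring of nodes, and bound the number
# of forwarding hops so a report can never circulate indefinitely.
import hashlib

NODES = ["node-a", "node-b", "node-c", "node-d"]

def home_node(key: str) -> str:
    digest = int(hashlib.sha256(key.encode()).hexdigest(), 16)
    return NODES[digest % len(NODES)]

def route(key: str, start: str, ttl: int = 8) -> str:
    """Walk the ring toward the key's home node within a hop budget."""
    node = start
    while node != home_node(key):
        if ttl == 0:
            raise RuntimeError("hop budget exhausted; report dropped")
        node = NODES[(NODES.index(node) + 1) % len(NODES)]
        ttl -= 1
    return node

store: dict[str, list] = {}
key = "incident:2024-05-01:42"
store.setdefault(route(key, "node-a"), []).append({"src": "10.0.0.5", "type": "probe"})
print(store)
```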

Relevance:

100.00%

Publisher:

Abstract:

This qualitative two-site case study examined the capacity building practices that Children’s Services Councils (CSCs), independent units of local government, provide to nonprofit organizations (NPOs) contracted to deliver human services. The contracting literature is replete with recommendations for government to provide capacity building to contracted NPOs, yet there is a dearth of scholarship on this topic. The study’s purpose was to increase the understanding of capacity building provided in a local government contracting setting. Data collection consisted primarily of in-depth interviews and focus groups with 73 staff from two CSCs and 28 contracted NPOs. Interview data were supplemented by participant observation and review of secondary data. The study analyzed capacity building needs, practices, influencing factors, and outcomes. The study identified NPO capacity building needs in: documentation and reporting, financial management, program monitoring and evaluation, participant recruitment and retention, and program quality. Additionally, sixteen different types of CSC capacity building practices were identified. Results indicated that three major factors impacted CSC capacity building: CSC capacity building goals, the relationship between the CSC and NPOs, and the level of NPO participation. Study results also provided insight into the dynamics of the CSC capacity building process, including unique problems, challenges, and opportunities as well as necessary resources. The results indicated that the CSCs’ relational contracting approach facilitated CSC capacity building and that CSC contract managers were central players in the process. The study provided evidence that local government agencies can serve as effective builders of NPO capacity. Additionally, results indicated that much of what is known about capacity building can be applied in this previously unstudied capacity building setting. Finally, the study laid the groundwork for future development of a model for capacity building in a local government contracting setting.

Relevance:

100.00%

Publisher:

Abstract:

Software Engineering is one of the most widely researched areas of Computer Science. The ability to reuse software, much like the reuse of hardware components, is one of the key issues in software development. The object-oriented programming methodology is revolutionary in that it promotes software reusability. This thesis describes the development of a tool that helps programmers to design and implement software from within the Smalltalk environment (an object-oriented programming environment). The ASDN tool is part of the PEREAM (Programming Environment for the Reuse and Evolution of Abstract Models) system, which advocates incremental development of software. The ASDN tool, along with the PEREAM system, seeks to enhance the Smalltalk programming environment by providing facilities for the structured development of abstractions (concepts). It produces a document that describes the abstractions developed using the tool. The features of the ASDN tool are illustrated by an example.
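
ASDN's document-producing role can be pictured with a rough Python analogue that walks a module's classes and prints a structured description of each abstraction; the output format is invented, since the abstract does not describe ASDN's actual layout.

```python
# Hypothetical analogue: generate a document describing the abstractions
# (classes) defined in a module, with their purposes and operations.
import inspect
import sys

class Stack:
    """A LIFO collection of elements."""
    def push(self, item): ...
    def pop(self): ...

def describe_abstractions(module) -> str:
    lines = []
    for name, cls in inspect.getmembers(module, inspect.isclass):
        lines.append(f"Abstraction: {name}")
        lines.append(f"  Purpose: {inspect.getdoc(cls) or '(undocumented)'}")
        for mname, meth in inspect.getmembers(cls, inspect.isfunction):
            lines.append(f"  Operation: {mname}{inspect.signature(meth)}")
    return "\n".join(lines)

print(describe_abstractions(sys.modules[__name__]))
```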

Relevance:

100.00%

Publisher:

Abstract:

The maintenance and evolution of software systems has become a highly critical task over recent years, owing to the diversity and high demand of functionalities, devices and users. Understanding and analyzing how new changes impact the quality attributes of the architecture of such systems is an essential prerequisite to prevent quality deterioration during their evolution. This thesis proposes an automated approach for analyzing variation of the performance quality attribute in terms of execution time (response time). It is implemented by a framework that adopts dynamic analysis and software repository mining techniques to provide an automated way of revealing potential sources – commits and issues – of performance variation in scenarios during the evolution of software systems. The approach defines four phases: (i) preparation – choosing the scenarios and preparing the target releases; (ii) dynamic analysis – determining the performance of scenarios and methods by computing their execution times; (iii) variation analysis – processing and comparing the dynamic analysis results across releases; and (iv) repository mining – identifying issues and commits associated with the detected performance variation. Empirical studies were carried out to evaluate the approach from different perspectives. An exploratory study analyzed the feasibility of applying the approach to systems from different domains, automatically identifying source code elements with performance variation and the changes that affected those elements during an evolution. This study analyzed three systems: (i) SIGAA – a web system for academic management; (ii) ArgoUML – a UML modeling tool; and (iii) Netty – a framework for network applications. Another study performed an evolutionary analysis by applying the approach to multiple releases of Netty and of the web frameworks Wicket and Jetty. In that study, 21 releases (seven from each system) were analyzed, totaling 57 scenarios. In summary, 14 scenarios with significant performance variation were found for Netty, 13 for Wicket and 9 for Jetty. Additionally, feedback was obtained from eight developers of these systems through an online form. Finally, in the last study, a regression model for performance was developed to indicate the properties of commits that are most likely to cause performance degradation. Overall, 997 commits were mined: 103 were retrieved from degraded source code elements and 19 from optimized ones, while 875 had no impact on execution time. The number of days before the release date and the day of the week proved to be the most relevant variables of performance-degrading commits in our model. The area under the ROC (Receiver Operating Characteristic) curve of the regression model is 60%, meaning that using the model to decide whether a commit will cause degradation is 10% better than a random decision.
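
A minimal sketch of the variation-analysis phase (phase iii): compare a scenario's measured execution times on two releases and flag a significant change. The Mann-Whitney test and the timing numbers are illustrative assumptions, not necessarily the statistics the thesis uses.

```python
# Compare one scenario's execution times across two releases.
from statistics import median
from scipy.stats import mannwhitneyu

def performance_variation(times_old, times_new, alpha=0.05):
    """Return (verdict, p-value) for a scenario across two releases."""
    _, p = mannwhitneyu(times_old, times_new)
    direction = "degraded" if median(times_new) > median(times_old) else "optimized"
    return (direction if p < alpha else "unchanged"), p

old = [102, 99, 101, 100, 103]   # ms per run on release N (invented numbers)
new = [121, 118, 122, 119, 120]  # ms per run on release N+1
print(performance_variation(old, new))  # ('degraded', small p-value)
```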

Relevance:

100.00%

Publisher:

Abstract:

The aim of this Thesis work is to study the multi-frequency properties of the Ultra Luminous Infrared Galaxy (ULIRG) IRAS 00183-7111 (I00183) at z = 0.327, connecting ALMA sub-mm/mm observations with those at high energies in order to place constraints on the properties of its central power source and verify whether the gas traced by the CO may be responsible for the obscuration observed in X-rays. I00183 was selected from the so-called Spoon diagnostic diagram (Spoon et al. 2007) for mid-infrared spectra of infrared galaxies, which plots the equivalent width of the 6.2 μm Polycyclic Aromatic Hydrocarbon (PAH) emission feature versus the 9.7 μm silicate strength. Such features are a powerful tool to investigate the contributions of star formation and AGN activity in this class of objects. I00183 was selected from the top-left region of the plot, where the most obscured sources, characterized by a strong Si absorption feature, are located. To link the sub-mm/mm to the X-ray properties of I00183, ALMA archival Cycle 0 data in Band 3 (87 GHz) and Band 6 (270 GHz) were calibrated and analyzed using the CASA software. ALMA Cycle 0 was the Early Science program, for which data reprocessing is strongly suggested. The main work of this Thesis consisted of reprocessing raw data to improve upon the available archival products and results, which were obtained using standard procedures. The high-energy data consist of Chandra, XMM-Newton and NuSTAR observations, which provide broad coverage of the spectrum in the 0.5−30 keV energy range. Chandra and XMM-Newton archival data were used, with exposure times of 22 and 22.2 ks, respectively; their reduction was carried out using the CIAO and SAS software. The 100 ks NuSTAR data are still private, and the spectra were obtained by courtesy of the PI (K. Iwasawa). A detailed spectral analysis was performed using the XSPEC software; the spectral shape was reproduced starting from simple phenomenological models, and then more physical models were introduced to account for the complex mechanisms at work in this source. In Chapter 1, an overview of the scientific background is given, with a focus on the target, I00183, and the Spoon diagnostic diagram from which it was originally selected. In Chapter 2, the basic principles of interferometry are briefly introduced, with a description of the calibration theory applied to interferometric observations. In Chapter 3, ALMA and its capabilities, both current and future, are described, along with the complex structure of the ALMA archive. In Chapter 4, the calibration of the ALMA data is presented and discussed, together with the resulting imaging products. In Chapter 5, the analysis and discussion of the main results obtained from the ALMA data are presented. In Chapter 6, the X-ray observations, data reduction and spectral analysis are reported, with a brief introduction to the basic principles of X-ray astronomy and the instruments with which the observations were carried out. Finally, the overall work is summarized, with particular emphasis on the main results obtained and possible future perspectives.
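
As a toy illustration of the "simple phenomenological models" that open the X-ray analysis, the sketch below evaluates an absorbed power law over the 0.5-30 keV band used in the thesis. The parameter values and the crude E^-3 photoabsorption scaling are assumptions for the demonstration, not fitted values from these data.

```python
# Absorbed power law: photon flux ~ norm * E^-gamma * exp(-nH * sigma(E)),
# with a rough sigma(E) ~ sigma0 * E^-3 photoabsorption scaling.
import numpy as np

def absorbed_powerlaw(energy_keV, norm, gamma, nH, sigma0=2e-22):
    sigma = sigma0 * energy_keV**-3.0      # cm^2 per H atom (crude)
    return norm * energy_keV**-gamma * np.exp(-nH * sigma)

E = np.geomspace(0.5, 30.0, 10)            # keV grid spanning the fitted band
flux = absorbed_powerlaw(E, norm=1e-3, gamma=1.8, nH=1e23)
for e, f in zip(E, flux):
    print(f"{e:6.2f} keV  {f:.3e}")
```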

Relevance:

100.00%

Publisher:

Abstract:

In this article we investigate voter volatility and analyze the causes and motives of switching vote intentions. We test two main sets of variables linked to volatility in the literature: political sophistication and ‘political (dis)satisfaction’. Results show that voters with low levels of political efficacy tend to switch more often, both within a campaign and between elections. In the analysis we differentiate between campaign volatility and inter-election volatility, and by doing so show that the dynamics of a campaign have a profound impact on volatility. The campaign period is when the less politically sophisticated switch their vote intention; those with higher levels of interest in politics have switched their intention before the campaign has started. The data for this analysis are from the three-wave PartiRep Belgian Election Study (2009).
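
To make the two notions concrete, the toy sketch below derives both from a three-wave panel: a switch between the previous election's vote and the first wave counts as inter-election volatility, and a switch between waves during the campaign counts as campaign volatility. The party labels and rows are invented, not PartiRep data.

```python
# Toy three-wave panel: classify respondents as campaign-volatile and/or
# inter-election-volatile from their reported vote (intentions).
import pandas as pd

panel = pd.DataFrame({
    "prev_election": ["A", "A", "B", "C"],
    "wave1":         ["A", "B", "B", "C"],
    "wave2":         ["A", "B", "C", "C"],  # intention during the campaign
})

panel["inter_election_volatile"] = panel["prev_election"] != panel["wave1"]
panel["campaign_volatile"] = panel["wave1"] != panel["wave2"]
print(panel[["inter_election_volatile", "campaign_volatile"]].mean())
```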

Relevance:

100.00%

Publisher:

Abstract:

With the cell therapy industry continuing to grow, the ability to preserve clinical-grade cells, including mesenchymal stem cells (MSCs), whilst retaining cell viability and function remains critical for the generation of off-the-shelf therapies. Cryopreservation of MSCs using slow freezing is an established process at lab scale. However, the cytotoxicity of cryoprotectants like Me2SO raises questions about the impact of prolonged cell exposure to cryoprotectant at temperatures >0 °C during processing of large cell batches for allogenic therapies, prior to rapid cooling in a controlled-rate freezer or in the clinic prior to administration. Here we show that exposure of human bone-marrow-derived MSCs to Me2SO for ≥1 h before freezing, or after thawing, degrades membrane integrity and short-term cell attachment efficiency and alters the cell immunophenotype. After 2 h of exposure to Me2SO at 37 °C post-thaw, membrane integrity dropped to ∼70% and only ∼50% of cells retained the ability to adhere to tissue culture plastic. Furthermore, only 70% of the recovered MSCs retained an immunophenotype consistent with the ISCT minimal criteria after exposure. We saw a similar loss of membrane integrity and attachment efficiency after exposing osteoblast (HOS TE85) cells to Me2SO before, and after, cryopreservation. Overall, these results show that exposure to freezing medium is a critical determinant of product quality as process scale increases. Defining and reporting cell sensitivity to freezing-medium exposure, both before and after cryopreservation, enables a fair judgement of how scalable a particular cryopreservation process can be, and consequently whether the therapy is commercially feasible.

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Child maltreatment is underreported in the United States and in North Carolina. In North Carolina and other states, mandatory reporting laws require various professionals to make reports, thereby helping to reduce underreporting of child maltreatment. This study aims to understand why emergency medical services (EMS) professionals may fail to report suspicions of maltreatment despite mandatory reporting policies. METHODS: A web-based, anonymous, voluntary survey of EMS professionals in North Carolina was used to assess knowledge of their agency's written protocols and potential reasons for underreporting suspicion of maltreatment (n=444). Results were based on descriptive statistics. Responses of line staff and leadership personnel were compared using chi-square analysis. RESULTS: Thirty-eight percent of respondents were unaware of their agency's written protocols regarding reporting of child maltreatment. Additionally, 25% of EMS professionals who knew of their agency's protocol incorrectly believed that the report should be filed by someone other than the person with firsthand knowledge of the suspected maltreatment. Leadership personnel generally understood reporting requirements better than did line staff. Respondents indicated that peers may fail to report maltreatment for several reasons: they believe another authority would file the report, including the hospital (52.3%) or law enforcement (27.7%); they are uncertain whether they had witnessed abuse (47.7%); and they are uncertain about what should be reported (41.4%). LIMITATIONS: This survey may not generalize to all EMS professionals in North Carolina. CONCLUSIONS: Training opportunities for EMS professionals that address proper identification and reporting of child maltreatment, as well as cross-agency information sharing, are warranted.
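
The line staff versus leadership comparison above rests on chi-square analysis; here is a minimal sketch of that style of test on a 2x2 awareness table. The counts are invented for illustration and are not the study's data.

```python
# Chi-square test of independence: protocol awareness by role.
from scipy.stats import chi2_contingency

#               aware  unaware
table = [[180,   120],   # line staff (invented counts)
         [110,    34]]   # leadership

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.4f}")
```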

Relevance:

100.00%

Publisher:

Abstract:

Background: Adolescence is a period of life associated with self-perceptions of negative body image. Physical activity levels are low, and screen time levels are high, during this stage. These perceptions and behaviours are associated with poor health outcomes, making research on their determinants important. Within adolescent populations, certain groups may be at higher risk of body dissatisfaction than others, and body dissatisfaction may influence individual physical activity and screen time levels. Objectives: The objectives of this thesis were to: 1) describe body image among young Canadians, examining possible health inequalities; 2) estimate the strength and significance of associations between body satisfaction, physical activity and screen time; and 3) examine the potential etiological role of biological sex. Methods: Objective 1: The 2013/2014 Health Behaviour in School-aged Children (HBSC) study was employed. Sex-stratified Rao-Scott chi-square analyses were conducted to examine associations between socio-demographic factors and body satisfaction. Objective 2: The 2005/2006 and 2013/2014 cross-sectional and 2006 longitudinal HBSC data sets were used. Sex-stratified modified Poisson regressions were conducted, and risk estimates and associated confidence intervals were obtained. Results: Objective 1: Among males, being older, of East and Southeast Asian ethnicity, and reporting low SES were all associated with body dissatisfaction. Among females, being older, of Arab and West Asian or African ethnicity, being born in Canada, and reporting low SES were all associated with being body dissatisfied. Objective 2: Cross-sectionally, males who reported ‘too fat’ body dissatisfaction were more likely to be physically inactive. Adolescents of both sexes who reported ‘too fat’ body dissatisfaction were more likely to engage in high levels of screen time. Data from the longitudinal component supported the idea that male ‘too fat’ body dissatisfaction temporally leads to physical inactivity, but showed an inverse relationship between body dissatisfaction and screen time. Conclusions: Objective 1: Future prevention efforts in Canada should target subgroups to effectively help those at greatest risk of body dissatisfaction and ameliorate potential inequalities at the population level. Objective 2: The presence of these relationships may inform future interventions, as part of a multi-factorial etiology, in order to increase physical activity and decrease screen time among youth.
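
The "modified Poisson regression" named above is Poisson regression with robust standard errors applied to a binary outcome, which yields risk ratios directly. Below is a minimal sketch on simulated data; the variable names and effect sizes are invented stand-ins, not HBSC records.

```python
# Modified Poisson regression: Poisson GLM + robust (HC0) standard errors
# on a binary outcome, so exp(coef) is a risk ratio.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
body_dissatisfied = rng.integers(0, 2, n)
inactive = rng.binomial(1, np.where(body_dissatisfied == 1, 0.45, 0.30))

X = sm.add_constant(body_dissatisfied)
fit = sm.GLM(inactive, X, family=sm.families.Poisson()).fit(cov_type="HC0")

print(np.exp(fit.params[1]))  # estimated risk ratio for body dissatisfaction
print(fit.conf_int())         # robust confidence intervals (log scale)
```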