846 results for Exploratory analysis


Relevance:

30.00%

Publisher:

Abstract:

The maintenance and evolution of software systems has become a highly critical task over recent years due to the diversity and high demand of features, devices and users. Understanding and analyzing how new changes impact the quality attributes of the architecture of such systems is an essential prerequisite to avoid the deterioration of their quality during their evolution. This thesis proposes an automated approach for analyzing variation of the performance quality attribute in terms of execution time (response time). It is implemented by a framework that adopts dynamic analysis and software repository mining techniques to provide an automated way of revealing potential sources – commits and issues – of performance variation in scenarios during software system evolution. The approach defines four phases: (i) preparation – choosing the scenarios and preparing the target releases; (ii) dynamic analysis – determining the performance of scenarios and methods by computing their execution times; (iii) variation analysis – processing and comparing the dynamic analysis results of different releases; and (iv) repository mining – identifying issues and commits associated with the detected performance variation. Empirical studies were performed to evaluate the approach from different perspectives. An exploratory study analyzed the feasibility of applying the approach to systems from different domains, to automatically identify source code elements with performance variation and the changes that affected such elements during an evolution. This study analyzed three systems: (i) SIGAA – a web system for academic management; (ii) ArgoUML – a UML modeling tool; and (iii) Netty – a framework for network applications. Another study performed an evolutionary analysis by applying the approach to multiple releases of Netty and of the web frameworks Wicket and Jetty.
In this study, 21 releases were analyzed (seven from each system), totaling 57 scenarios. In summary, 14 scenarios with significant performance variation were found for Netty, 13 for Wicket and 9 for Jetty. Additionally, feedback was obtained from eight developers of these systems through an online form. Finally, in the last study, a regression model for performance was developed to indicate the properties of commits that are most likely to cause performance degradation. Overall, 997 commits were mined: 103 were retrieved from degraded source code elements and 19 from optimized ones, while 875 had no impact on execution time. The number of days before the release and the day of the week proved to be the most relevant variables of performance-degrading commits in our model. The area under the receiver operating characteristic (ROC) curve of the regression model is 60%, meaning that using the model to decide whether a commit will cause degradation is 10% better than a random decision.
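The variation-analysis phase (iii) amounts to comparing execution-time samples of the same scenario across releases. A minimal sketch of that comparison follows; the timings, the 10% threshold and the function name are illustrative assumptions, not values or code from the thesis:

```python
# Minimal sketch of the variation-analysis phase: compare execution-time
# samples of one scenario in two releases and flag significant variation.
# Timings and the threshold are invented assumptions, not the thesis's.
import numpy as np

def performance_variation(times_old, times_new, threshold=0.10):
    """Relative change in mean execution time, flagged if beyond threshold."""
    old_mean, new_mean = np.mean(times_old), np.mean(times_new)
    change = (new_mean - old_mean) / old_mean
    label = "degraded" if change > threshold else (
            "optimized" if change < -threshold else "unchanged")
    return change, label

# Invented timings (ms) of one scenario in two consecutive releases.
release_1 = [120.0, 118.0, 122.0, 121.0, 119.0]
release_2 = [150.0, 148.0, 152.0, 149.0, 151.0]

change, label = performance_variation(release_1, release_2)
print(round(change, 2), label)  # 0.25 degraded
```

Source code elements flagged as "degraded" or "optimized" would then feed the repository-mining phase (iv), which looks up the commits and issues touching them.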


Relevance:

30.00%

Publisher:

Abstract:

One commonality across the leadership and knowledge related literature is the apparent neglect of leaders' own knowledge. This thesis sought to address this issue by conducting exploratory research into the content of leaders' personal knowledge and the process of knowing it. The empirical inquiry adopted a longitudinal approach, with interviews conducted at two separate time periods with an extended time interval between each. The findings from this research contrast with images of leadership which suggest leaders are in control of what they know, that they own their own knowledge. The picture that emerges is one of individuals struggling to keep abreast of the knowledge required to deal with the dynamics and uncertainties of organisational life. Much knowledge is tacit, provisional and perishable, and the related process of knowing is more organic, evolutionary and informal than any structured or orchestrated approach. The collective nature of knowing is a central feature, with these leaders embedded in networks of uncontrollable relationships. In view of the indeterminate nature of knowing, the boundary between what is known and what one needs to know is both amorphous and ephemeral, and the likelihood of knowledge-absences is escalated. A significant finding in this regard is the identification of two critical points where not-knowing is most likely (entry into and exit from the role) and the differing implications of each. Over time, the knowledge that is legitimised or prioritised is significantly altered as these leaders replace the dogmas previously held in high esteem with the lessons of their own experience. This experience brings increased self-knowledge and a deeper appreciation of the values and morals instilled in their early lives. In view of the above findings, this study makes a theoretical contribution to a number of core literatures: authentic leadership, role transition and knowledge-absences.
In terms of leadership development, the findings point to the need to prepare leaders for the challenges they will encounter at the pivotal stages of the leadership role.

Relevance:

30.00%

Publisher:

Abstract:

Neophobia, the fear of novelty, is a behavioral trait found across a number of animal species, including humans. Neophobic individuals perceive novel environments and stimuli to have aversive properties, and exhibit fearful behaviors when presented with non-familiar situations. The present study examined how early life exposure to aversive novel stimuli could reduce neophobia in bobwhite quail chicks. Experiment 1 exposed chicks to a novel auditory tone previously shown to be aversive to naïve chicks (Suarez, 2012) for 24 hours immediately after hatching, then subsequently tested them in the presence of the tone within a novel maze task. Postnatally exposed chicks demonstrated decreased fearfulness compared to naïve chicks, and behaved more similarly to chicks tested in the presence of a known attractive auditory stimulus (a bobwhite maternal assembly call vocalization). Experiment 2 exposed chicks to the novel auditory tone for 24 hours prenatally, then subsequently tested them within a novel maze task. Prenatally exposed chicks showed decreased fearfulness to a similar degree as those postnatally exposed, revealing that both prenatal and postnatal exposure methods are capable of decreasing fear of auditory stimuli. Experiment 3 exposed chicks to a novel visual stimulus for 24 hours postnatally, then subsequently tested them within a novel emergence box / T-maze apparatus. Chicks exposed to the visual stimulus showed decreased fearfulness compared to naïve chicks, thereby demonstrating the utility of this method across sense modalities. Experiment 4 assessed whether early postnatal exposure to one novel stimulus could generalize and serve to decrease fear of novelty when chicks were tested in the presence of markedly different stimuli. 
By combining the methods of Experiments 1 and 3, this experiment revealed that chicks exposed to one type of stimulus (auditory or visual) demonstrated decreased fear when subsequently tested in the presence of the opposite type of novel stimulus. These results suggest that experience with novel stimuli can moderate the extent to which neophobia will develop during early development.

Relevance:

30.00%

Publisher:

Abstract:

Advertising investment and audience figures indicate that television continues to lead as a mass advertising medium. However, its effectiveness is questioned due to problems such as zapping, saturation and audience fragmentation, which has favoured the development of non-conventional advertising formats. This study provides empirical evidence for this theoretical development. The investigation analyzes the recall generated by four non-conventional advertising formats in a real environment – short programme (branded content), television sponsorship, and internal and external telepromotion – versus the more conventional spot. The methodology integrated secondary data with primary data from ad hoc computer-assisted telephone interviews (CATI) conducted on a sample of 2,000 individuals, aged 16 to 65, representative of the total television audience. Our findings show that non-conventional advertising formats are more effective at a cognitive level, generating higher levels of both unaided and aided recall, in all analyzed formats, when compared to the spot.

Relevance:

30.00%

Publisher:

Abstract:

A cluster provides a greater commercial relationship between the companies that comprise it. This encourages companies to adopt cooperative structures that allow them to solve problems they could hardly tackle alone (Lubeck et al., 2011). This paper therefore aims to describe the coopetition between companies operating in a planned commercial cluster, from the point of view of retailers, based on the theoretical models proposed by Bengtsson and Kock (1999) and Leon (2005) and operationalized by means of Social Network Analysis (SNA). Data collection consisted of two phases: the first, exploratory, aimed to identify the actors; the second was descriptive, aiming to describe the coopetition among the enterprises. As a result, we identified companies that cooperate and compete simultaneously (coopetition), firms that only compete, companies that only cooperate, and businesses that neither compete nor cooperate (coexistence).

Relevance:

30.00%

Publisher:

Abstract:

Purpose – The objective of this exploratory study is to investigate the “flow-through” or relationship between top-line measures of hotel operating performance (occupancy, average daily rate and revenue per available room) and bottom-line measures of profitability (gross operating profit and net operating income), before and during the recent great recession. Design/methodology/approach – This study uses data provided by PKF Hospitality Research for the period from 2007-2009. A total of 714 hotels were analyzed and various top-line and bottom-line profitability changes were computed using both absolute levels and percentages. Multiple regression analysis was used to examine the relationship between top and bottom line measures, and to derive flow-through ratios. Findings – The results show that average daily rate (ADR) and occupancy are significantly and positively related to gross operating profit per available room (GOPPAR) and net operating income per available room (NOIPAR). The evidence indicates that ADR, rather than occupancy, appears to be the stronger predictor and better measure of RevPAR growth and bottom-line profitability. The correlations and explained variances are also higher than those reported in prior research. Flow-through ratios range between 1.83 and 1.91 for NOIPAR, and between 1.55 and 1.65 for GOPPAR, across all chain-scales. Research limitations/implications – Limitations of this study include the limited number of years in the study period, limited number of hotels in a competitive set, and self-selection of hotels by the researchers. Practical implications – While ADR and occupancy work in combination to drive profitability, the authors' study shows that ADR is the stronger predictor of profitability. Hotel managers can use flow-through ratios to make financial forecasts, or use them as inputs in valuation models, to forecast future profitability. 
Originality/value – This paper extends prior research on the relationship between top-line measures and bottom-line profitability and serves to inform lodging owners, operators and asset managers about flow-through ratios, and how these ratios impact hotel profitability.
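The flow-through ratios described above can be read as regression slopes of the change in bottom-line profit on the change in top-line revenue. The sketch below estimates one such slope from synthetic numbers; the 1.6 value is an assumption chosen to sit within the GOPPAR range the paper reports, not the paper's estimate, and the data are invented:

```python
# Illustrative sketch (synthetic numbers, not the paper's data): a
# flow-through ratio estimated as the OLS slope of the change in
# bottom-line profit (GOPPAR) on the change in top-line revenue (RevPAR).
import numpy as np

rng = np.random.default_rng(0)
delta_revpar = rng.normal(0, 10, 200)   # invented changes in RevPAR
true_flow_through = 1.6                 # assumed slope, within the reported GOPPAR range
delta_goppar = true_flow_through * delta_revpar + rng.normal(0, 2, 200)

# Ordinary least squares slope: cov(x, y) / var(x)
slope = np.cov(delta_revpar, delta_goppar)[0, 1] / np.var(delta_revpar, ddof=1)
print(round(slope, 2))
```

A ratio above 1 means each extra dollar of revenue adds more than a dollar of profit, which is why the paper's reported ratios (1.55 to 1.91) matter for forecasting.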

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this study is to explore the link between decentralization and the impact of natural disasters through empirical analysis. It addresses the importance of the role of local government in disaster response through different means of decentralization. By studying data available for 50 countries, it develops knowledge on the role of national government in setting policy that allows flexibility and decision making at a local level, and on how this devolution of power influences the outcome of disasters. The study uses Aaron Schneider's definition and rankings of decentralization, the EM-DAT database to identify the number of people affected by disasters on average per year, as well as World Bank indicators and the Human Development Index (HDI), to model the role of local decentralization in mitigating disasters. Using a multivariate regression, it looks at the number of affected people as explained by fiscal, administrative and political decentralization, government expenses, percentage of urbanization, total population, population density, the HDI and the overall Logistics Performance Indicator (LPI). The main results are that total population, the overall LPI and fiscal decentralization are all significantly related to the number of people affected by disasters for the countries and period studied. These findings have implications for government policy, indicating that fiscal decentralization, by allowing local governments to control a larger proportion of a country's revenues and expenditures, plays a role in reducing the number of people affected by disasters. This can be explained by the fact that local governments understand their own needs better in both disaster prevention and response, which helps in taking the proper decisions to mitigate the number of people affected in a disaster. Reduced involvement of national government might also shorten the reaction time when facing a disaster.
The main conclusion of this study is that fiscal control by local governments can help reduce the number of people affected by disasters.
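The multivariate regression described above can be sketched as follows. The data, coefficient values and the reduced set of predictors are invented for illustration; this is not the study's 50-country data set:

```python
# Hypothetical sketch of the study's multivariate regression: (log)
# people affected by disasters regressed on a subset of the predictors
# named in the abstract. All numbers below are synthetic assumptions.
import numpy as np

rng = np.random.default_rng(42)
n = 50  # one row per country, matching the study's sample size

fiscal_decent = rng.uniform(0, 1, n)     # fiscal decentralization score
log_population = rng.normal(16, 2, n)    # log of total population
lpi = rng.uniform(1, 5, n)               # Logistics Performance Indicator

# Assumed signs only: more fiscal decentralization and a better LPI
# reduce the (log) number of affected people; population increases it.
log_affected = (2.0 - 1.5 * fiscal_decent + 0.8 * log_population
                - 0.6 * lpi + rng.normal(0, 0.5, n))

# OLS fit with an intercept column.
X = np.column_stack([np.ones(n), fiscal_decent, log_population, lpi])
coef, *_ = np.linalg.lstsq(X, log_affected, rcond=None)
print(coef.round(2))  # order: intercept, fiscal, population, LPI
```

Under these assumptions the fitted signs reproduce the study's qualitative finding: the fiscal-decentralization coefficient comes out negative while population's is positive.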

Relevance:

30.00%

Publisher:

Abstract:

Measurement of marine algal toxins has traditionally focussed on shellfish monitoring while, over the last decade, passive sampling has been introduced as a complementary tool for exploratory studies. Since 2011, liquid chromatography-tandem mass spectrometry (LC-MS/MS) has been adopted as the EU reference method (No.15/2011) for detection and quantitation of lipophilic toxins. Traditional LC-MS approaches have been based on low-resolution mass spectrometry (LRMS); however, advances in instrument platforms have led to a heightened interest in the use of high-resolution mass spectrometry (HRMS) for toxin detection. This work describes the use of HRMS in combination with passive sampling as a progressive approach to marine algal toxin surveys. Experiments focused on comparison of LRMS and HRMS for determination of a broad range of toxins in shellfish and passive samplers. Matrix effects are an important issue to address in LC-MS; therefore, this phenomenon was evaluated for mussels (Mytilus galloprovincialis) and passive samplers using LRMS (triple quadrupole) and HRMS (quadrupole time-of-flight and Orbitrap) instruments. Matrix-matched calibration solutions containing okadaic acid and dinophysistoxins, pectenotoxin, azaspiracids, yessotoxins, domoic acid, pinnatoxins, gymnodimine A and 13-desmethyl spirolide C were prepared. Similar matrix effects were observed on all instrument types. Most notably, there was ion enhancement for pectenotoxins and okadaic acid/dinophysistoxins on one hand, and ion suppression for yessotoxins on the other. Interestingly, the ion selected for quantitation of PTX2 also influenced the magnitude of matrix effects, with the sodium adduct typically exhibiting less susceptibility to matrix effects than the ammonium adduct. As expected, mussel, as a biological matrix, produced significantly greater matrix effects than passive sampler extracts, irrespective of toxin.
Sample dilution was demonstrated as an effective measure to reduce matrix effects for all compounds, and was found to be particularly useful for the non-targeted approach. Limits of detection and method accuracy were comparable between the systems tested, demonstrating the applicability of HRMS as an effective tool for screening and quantitative analysis. HRMS offers the advantage of untargeted analysis, meaning that datasets can be retrospectively analysed. HRMS (full scan) chromatograms of passive samplers yielded significantly less complex data sets than mussels, and were thus more easily screened for unknowns. Consequently, we recommend the use of HRMS in combination with passive sampling for studies investigating emerging or hitherto uncharacterised toxins.
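Matrix effects of the kind compared above are commonly quantified from calibration slopes. The formula below is a standard convention and an assumption here, since the abstract does not state the exact calculation used, and the slope values are invented:

```python
# Illustrative matrix-effect calculation from calibration slopes.
# ME% > 0 indicates ion enhancement, ME% < 0 ion suppression.
# The slopes below are invented example values, not data from the study.

def matrix_effect_percent(slope_matrix: float, slope_solvent: float) -> float:
    """Percent matrix effect relative to a solvent-based calibration."""
    return (slope_matrix / slope_solvent - 1.0) * 100.0

# Invented slopes: enhancement for an okadaic-acid-like analyte,
# suppression for a yessotoxin-like analyte.
enhancement = matrix_effect_percent(slope_matrix=1.25, slope_solvent=1.0)
suppression = matrix_effect_percent(slope_matrix=0.70, slope_solvent=1.0)
print(round(enhancement, 1), round(suppression, 1))
```

On this convention, the sample-dilution result above corresponds to pushing both values toward zero by diluting the matrix before injection.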

Relevance:

30.00%

Publisher:

Abstract:

Objective: To evaluate the functional status of elderly residents in long-term institutions. Methods: Exploratory-descriptive study, developed in two long-term care institutions for the elderly (LTC) in the city of Fortaleza, Ceará. The instruments utilized were: 1) a sociodemographic form, 2) the Functional Independence Measure (FIM), and 3) the International Classification of Functioning (ICF). Data were descriptively analyzed through the calculation of frequency, mean and standard deviation. Results: There was a predominance of males (n=47; 59.49%), with a mean age of 74.58 (± 8.89) years; 68.35% (n=54) have been or are married, and 49.37% (n=39) are illiterate. With reference to the FIM, it was observed that the elderly perform activities in a complete or modified mode, and 18.99% (n=15) have difficulty climbing stairs. As to the association between the FIM and the ICF, in relation to self-care, 96.20% (n=76) have no difficulty in performing tasks; 92.40% (n=73) move around without difficulty; and 98.73% (n=78) have preserved cognition. In relation to the capacity to maintain and control social interactions, all had this domain preserved. Conclusion: The surveyed elderly presented good cognitive status and little dependence in activities regarding personal care, mobility and communication. The use of the ICF allows visualization of the functionality scenario among the elderly, which can facilitate more effective health promotion strategies for this population.

Relevance:

30.00%

Publisher:

Abstract:

Objectives: In recent years, Internet access has grown markedly providing individuals with new opportunities for online information retrieval, psychological advice and support. The objectives of the present study were to explore the context through which dentally anxious individuals access an online support group and the nature of their online experiences. Methods: An online questionnaire was completed by 143 individuals who accessed the Dental Fear Central online support group bulletin board. Qualitative analysis was conducted on the responses. Results: Analysis revealed three emergent themes which reflected the motives and experiences of individuals: ‘Searching for help’, ‘Sharing fears’ and ‘I feel empowered’. Conclusion: This exploratory study suggests that for most individuals accessing this online support group was a positive and beneficial experience. Practice Implications: Online support groups may represent a convenient and beneficial tool that may assist certain individuals to confront their debilitating anxiety/phobia and successfully receive dental care.

Relevance:

30.00%

Publisher:

Abstract:

The concepts of smart city and social innovation, in combination with the increasing use of ICT by citizens and public authorities, could enhance the involvement of people in the decisions that directly affect their daily life. A case study approach was adopted to illustrate the potential of civic crowdfunding for increasing participation and collaboration between citizens, firms and government. The analysis of two exemplary cases shows that civic crowdfunding platforms could be used by public administration to engage communities in the search for solutions to local problems. Likewise, they could be used to reinforce community ties and to leverage the bonds among the stakeholders and partners of the community ecosystem.

Relevance:

30.00%

Publisher:

Abstract:

Objective: The objective of this study was to explore women's perceptions of and satisfaction with the nursing care they received following stillbirth and neonatal death in villages around a community hospital in Lilongwe. Methods: This qualitative, exploratory study recruited, through a mixture of purposive and snowball sampling, 20 women who had lost a child through stillbirth or neonatal death in the past 2 years. Data were collected through semi-structured interviews in the privacy of the women's homes. All interviews were tape-recorded, transcribed verbatim and analyzed using thematic analysis. Results: Almost half of the respondents expressed satisfaction with the way nurses cared for them after experiencing perinatal loss, although some felt unable to comment on the quality of care received. However, several bereaved women were dissatisfied with how nurses handled their loss. They noted that nurses did not provide attention or explanations, and some even attributed the death of their child to nurses' neglect. Conclusions: Interventions are needed which foster awareness, so that nurses become more sensitive to mothers' emotional needs within an equally sensitive health care system. There is also a need for more research into care provided following perinatal deaths in resource-poor settings, to strengthen the evidence base for informed and improved care for women who have experienced child loss.

Relevance:

30.00%

Publisher:

Abstract:

Within the framework of research on students' active performance in their study habits, the aim of this study is to analyze a model predicting the effect of social identity and personal initiative on engagement in university students. We conducted a cross-sectional study of 266 students from different Spanish universities. The resulting data were analyzed using the SPSS macro MEDIATE. Evidence was found for the proposed model: only group identity predicted personal initiative and engagement. Analysis revealed the mediating role of proactive behavior on engagement in university students. It is concluded that university management may intervene, from an organizational-culture approach, by promoting guidelines that reinforce students' sense of belonging while enhancing initiative and autonomous problem solving in learning behaviors.
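The mediation logic behind tools such as the MEDIATE macro can be sketched in a few lines: the indirect effect is the product of the path from predictor to mediator and the path from mediator to outcome. The data and path coefficients below are invented assumptions, not the study's estimates:

```python
# Hypothetical sketch of simple mediation: personal initiative
# (proactive behavior) mediating the effect of group identity on
# engagement. Synthetic data; coefficients are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(7)
n = 266  # sample size reported in the abstract

group_identity = rng.normal(0, 1, n)
initiative = 0.5 * group_identity + rng.normal(0, 1, n)                      # path a
engagement = 0.4 * initiative + 0.2 * group_identity + rng.normal(0, 1, n)   # paths b, c'

def slope(x, y):
    """OLS slope of y on x."""
    return np.cov(x, y)[0, 1] / np.var(x, ddof=1)

a = slope(group_identity, initiative)
# Path b: partial coefficient of initiative, controlling for group identity.
X = np.column_stack([np.ones(n), group_identity, initiative])
b = np.linalg.lstsq(X, engagement, rcond=None)[0][2]

indirect_effect = a * b  # mediated effect of identity on engagement
print(round(indirect_effect, 2))
```

In practice the macro also bootstraps a confidence interval around this product, which is what licenses the claim of mediation.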

Relevance:

30.00%

Publisher:

Abstract:

Background: Statistical analysis of DNA microarray data provides a valuable diagnostic tool for the investigation of genetic components of diseases. To take advantage of the multitude of available data sets and analysis methods, it is desirable to combine both different algorithms and data from different studies. Applying ensemble learning, consensus clustering and cross-study normalization methods for this purpose in an almost fully automated process and linking different analysis modules together under a single interface would simplify many microarray analysis tasks. Results: We present ArrayMining.net, a web-application for microarray analysis that provides easy access to a wide choice of feature selection, clustering, prediction, gene set analysis and cross-study normalization methods. In contrast to other microarray-related web-tools, multiple algorithms and data sets for an analysis task can be combined using ensemble feature selection, ensemble prediction, consensus clustering and cross-platform data integration. By interlinking different analysis tools in a modular fashion, new exploratory routes become available, e.g. ensemble sample classification using features obtained from a gene set analysis and data from multiple studies. The analysis is further simplified by automatic parameter selection mechanisms and linkage to web tools and databases for functional annotation and literature mining. Conclusion: ArrayMining.net is a free web-application for microarray analysis combining a broad choice of algorithms based on ensemble and consensus methods, using automatic parameter selection and integration with annotation databases.
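Consensus clustering, one of the ensemble methods the abstract lists, pools several clustering runs into a co-association matrix and derives a partition from it. The following is a minimal illustration with synthetic data and a hand-rolled k-means; it is not the ArrayMining.net implementation:

```python
# Sketch of consensus clustering: run k-means several times, build a
# co-association matrix, and read off the consensus partition.
# Purely illustrative, with synthetic "expression" data.
import numpy as np

rng = np.random.default_rng(1)
# Two well-separated synthetic clusters of 10 samples each, 5 features.
data = np.vstack([rng.normal(0, 0.3, (10, 5)), rng.normal(3, 0.3, (10, 5))])
n = len(data)

def kmeans_labels(x, k, seed):
    """Minimal k-means (Lloyd's algorithm), enough for the sketch."""
    r = np.random.default_rng(seed)
    centers = x[r.choice(len(x), k, replace=False)]
    for _ in range(20):
        labels = np.argmin(((x[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([x[labels == j].mean(0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels

# Co-association: fraction of runs in which two samples share a cluster.
coassoc = np.zeros((n, n))
for seed in range(10):
    labels = kmeans_labels(data, k=2, seed=seed)
    coassoc += (labels[:, None] == labels[None, :])
coassoc /= 10

# Consensus partition: samples co-clustered in most runs stay together.
consensus = (coassoc[0] > 0.5).astype(int)  # membership relative to sample 0
print(consensus)
```

The same pooling idea extends to combining runs of different algorithms or data from different studies, which is the combination the web application automates.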