847 results for Traditional methodologies


Relevance:

60.00%

Publisher:

Abstract:

The aim of this work is the development of automated software testing frameworks. This type of testing is normally associated with the evolutionary model and with agile software development methodologies, whereas manual testing is related to the waterfall model and traditional methodologies. A comparative study of the existing types of methodologies and tests was therefore carried out, in order to decide which ones best suited the project and to answer the question "Is it really worth performing (automated) tests?". Once the study was complete, two frameworks were developed: the first for implementing functional and unit tests without dependencies, to be used by LabOrders' curricular interns, and the second for implementing unit tests with external database and service dependencies, to be used by the company's employees. Over the last two decades agile software development methodologies have not stopped evolving, yet automation tools have failed to keep pace with this progress. Many areas are not covered by the tests, so some still have to be performed manually. For this reason, several innovative features were created to increase test coverage and make the frameworks as intuitive as possible, namely: 1. automatic download of files through Internet Explorer 9 (and later versions); 2. analysis of the content of .pdf files (from within the tests); 3. retrieval of web elements and their attributes through jQuery code using the WebDriver API with PHP bindings; 4. display of custom error messages when a given element cannot be found. The implemented frameworks are also prepared for the creation of other tests (load, integration, regression) that may become necessary in the future. They were tested in a real work context by the employees and clients of the company where the master's project was carried out, and the results showed that adopting a software development methodology with automated tests can increase productivity, reduce failures, and help organisations keep their projects within budget and on schedule.
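
Features 3 and 4 above lend themselves to a brief illustration. The sketch below is a hypothetical Python/Selenium analogue (the frameworks described in the abstract use the WebDriver API with PHP bindings, and the selector, URL and browser choice here are placeholders): it evaluates a jQuery selector via execute_script and raises a custom, descriptive error when no matching element is found.

```python
# Hypothetical Python analogue of features 3 and 4 above (the original
# frameworks use the WebDriver API with PHP bindings).
from selenium import webdriver
from selenium.common.exceptions import JavascriptException


def find_with_jquery(driver, selector):
    """Return the first element matched by a jQuery selector,
    or raise a descriptive error when nothing is found."""
    try:
        element = driver.execute_script(
            "return jQuery(arguments[0]).get(0);", selector)
    except JavascriptException:
        raise RuntimeError(
            f"jQuery is not available on the page, cannot evaluate '{selector}'")
    if element is None:
        raise AssertionError(
            f"No element matches the selector '{selector}' on {driver.current_url}")
    return element


if __name__ == "__main__":
    driver = webdriver.Firefox()          # any WebDriver-compatible browser
    driver.get("https://example.com")     # placeholder URL
    button = find_with_jquery(driver, "button.submit")   # placeholder selector
    print(button.get_attribute("outerHTML"))
    driver.quit()
```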

Relevance:

60.00%

Publisher:

Abstract:

Renal (kidney) failure means that one’s kidneys have unexpectedly stopped functioning, i.e., once chronic disease is present, the presence or degree of kidney dysfunction and its progression must be assessed, and the underlying syndrome has to be diagnosed. Although the patient’s history and physical examination may denote good practice, some key information has to be obtained from the estimation of the glomerular filtration rate and the analysis of serum biomarkers. Indeed, chronic kidney disease denotes abnormal kidney function and/or structure, and there is evidence that treatment may avoid or delay its progression, namely by reducing or preventing the development of associated complications such as hypertension, obesity, diabetes mellitus, and cardiovascular complications. Acute kidney injury appears abruptly, with a rapid deterioration of renal function, but is often reversible if it is recognized early and treated promptly. In both situations, i.e., acute kidney injury and chronic kidney disease, an early intervention can significantly improve the prognosis. The assessment of these pathologies is therefore mandatory, although it is hard to do with traditional methodologies and existing problem-solving tools. Hence, in this work we focus on the development of a hybrid decision support system, in terms of its knowledge representation and reasoning procedures, based on Logic Programming, which allows one to consider incomplete, unknown, and even contradictory information, complemented with an approach to computing centered on Artificial Neural Networks, in order to weigh the Degree-of-Confidence that one has in such a happening. The present study involved 558 patients with an average age of 51.7 years, and chronic kidney disease was observed in 175 cases. The dataset comprises twenty-four variables, grouped into five main categories. The proposed model showed good performance in the diagnosis of chronic kidney disease, with sensitivity and specificity ranging between 93.1–94.9% and 91.9–94.2%, respectively.
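
As a rough, non-authoritative illustration of the neural-network side of such a hybrid system (not the authors' actual model), the sketch below trains a small feed-forward classifier on synthetic data with the dimensions quoted above (558 cases, 24 variables) and reads the class probability as a degree-of-confidence, from which sensitivity and specificity are computed. The data, network size and decision threshold are assumptions.

```python
# Illustrative sketch only: a small feed-forward network whose class
# probability is read as a degree-of-confidence for the CKD diagnosis.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(558, 24))                   # 24 variables, 558 cases (as in the study)
y = (rng.random(558) < 175 / 558).astype(int)    # synthetic labels, roughly 175 positives

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

net = MLPClassifier(hidden_layer_sizes=(12,), max_iter=2000, random_state=0)
net.fit(X_train, y_train)

proba = net.predict_proba(X_test)[:, 1]          # degree-of-confidence per patient
pred = (proba >= 0.5).astype(int)

tp = np.sum((pred == 1) & (y_test == 1))
fn = np.sum((pred == 0) & (y_test == 1))
tn = np.sum((pred == 0) & (y_test == 0))
fp = np.sum((pred == 1) & (y_test == 0))
# On random data these figures are meaningless; with the real dataset they
# would correspond to the sensitivity and specificity reported above.
print("sensitivity:", tp / (tp + fn), "specificity:", tn / (tn + fp))
```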

Relevance:

60.00%

Publisher:

Abstract:

Parchment is a multifaceted material made from animal skin, which has been used for centuries as a writing support or for bookbinding. Owing to the historic value of objects made of parchment, understanding their degradation and their condition is of utmost importance to archives, libraries and museums, i.e., the assessment of parchment degradation is mandatory, although it is hard to do with traditional methodologies and problem-solving tools. Hence, in this work we focus on the development of a hybrid decision support system, in terms of its knowledge representation and reasoning procedures, under a formal framework based on Logic Programming, complemented with an approach to computing centered on Artificial Neural Networks, in order to evaluate Parchment Degradation and the respective Degree-of-Confidence that one has in such a happening.
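
As a heavily simplified sketch of how a Degree-of-Confidence can be attached to incomplete or interval-valued observations in this kind of Logic Programming framework, the snippet below normalises each attribute to the unit interval and scores confidence as sqrt(1 − width²). This particular convention is borrowed from related work and is an assumption here, not necessarily the formulation used by the authors.

```python
# Hedged sketch: degree-of-confidence for interval-valued observations,
# assuming the convention DoC = sqrt(1 - width^2) on a [0, 1] scale.
import math

def degree_of_confidence(lo, hi, attr_min, attr_max):
    """Confidence of an observation given as the interval [lo, hi]
    for an attribute whose admissible range is [attr_min, attr_max]."""
    span = attr_max - attr_min
    width = (hi - lo) / span                    # normalised interval width
    return math.sqrt(max(0.0, 1.0 - width ** 2))

# An exactly known value has width 0 -> confidence 1.0;
# a completely unknown value spans the whole range -> confidence 0.0.
print(degree_of_confidence(7.2, 7.2, 0.0, 14.0))    # 1.0
print(degree_of_confidence(0.0, 14.0, 0.0, 14.0))   # 0.0
```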

Relevance:

60.00%

Publisher:

Abstract:

The aim of the project was to improve the teaching quality of the course Estructura de Computadors I, taught at the Facultat d'Informàtica de Barcelona (UPC) within the degree programmes in Enginyeria Informàtica, Enginyeria Tècnica en Informàtica de Sistemes and Enginyeria Tècnica en Informàtica de Gestió. Work was carried out along four lines of action: (i) application of active learning techniques in class; (ii) application of non-classroom cooperative learning techniques; (iii) introduction of new ICT tools and adaptation of those already in use, in order to enable self-assessment mechanisms and feedback of assessment information; and (iv) dissemination of the experiences derived from the different actions. Regarding the first two measures, the impact of teaching methodologies that favour active learning, both in and out of the classroom, was evaluated, showing clear improvements in performance compared with previously used methodologies focused on lecture-style classes, in which students are merely given the course documentation so that they can work on their own responsibility. The new methodologies place special emphasis on group work in class and on sharing experiences outside class through participation forums. The measure that required the most effort in this project was the third one, with the development of a web-interface environment aimed at the automatic correction of programs written in assembly language. This environment allows students to self-assess the exercises set in the course, obtaining detailed information about the errors made. The work carried out within this project has been published at relevant teaching conferences, both nationally and internationally. The source code of the aforementioned environment is made publicly available through a link on the web.
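
To make the self-assessment idea concrete, here is a deliberately simplified, hypothetical Python sketch of an automatic-correction core: it runs a submitted program against a test case and returns detailed feedback on failure. The real environment targets assembly-language programs through a web interface; the command, test input and expected output below are stand-ins.

```python
# Much-simplified illustration of the automatic-correction idea: run a
# submission against one test case and report detailed feedback.
import subprocess

def grade(command, stdin_text, expected_stdout):
    """Run the submitted program and return (passed, feedback)."""
    result = subprocess.run(
        command, input=stdin_text, capture_output=True, text=True, timeout=5)
    if result.returncode != 0:
        return False, f"program exited with status {result.returncode}: {result.stderr}"
    if result.stdout.strip() != expected_stdout.strip():
        return False, (f"wrong output:\n  expected: {expected_stdout!r}\n"
                       f"  obtained: {result.stdout!r}")
    return True, "all checks passed"

if __name__ == "__main__":
    # Demo with a tiny stand-in "submission" (a one-line Python program);
    # the real environment would invoke an assembled student binary instead.
    submission = ["python3", "-c", "print(sum(map(int, input().split())))"]
    ok, feedback = grade(submission, "3 4\n", "7\n")
    print("PASS" if ok else "FAIL", "-", feedback)
```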

Relevance:

60.00%

Publisher:

Abstract:

The first discussion of compositional data analysis is attributable to Karl Pearson, in 1897. However, notwithstanding the recent developments on the algebraic structure of the simplex, more than twenty years after Aitchison’s idea of log-transformations of closed data, the scientific literature is again full of statistical treatments of this type of data using traditional methodologies. This is particularly true in environmental geochemistry where, besides the problem of closure, the spatial structure (dependence) of the data has to be considered. In this work we propose the use of log-contrast values, obtained by a simplicial principal component analysis, as indicators of given environmental conditions. The investigation of the log-contrast frequency distributions allows pointing out the statistical laws able to generate the values and to govern their variability. The changes, if compared, for example, with the mean values of the random variables assumed as models, or with other reference parameters, allow defining monitors to be used to assess the extent of possible environmental contamination. A case study on running and ground waters from the Chiavenna Valley (Northern Italy), using Na+, K+, Ca2+, Mg2+, HCO3-, SO42- and Cl- concentrations, is illustrated.
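
For readers unfamiliar with the simplicial approach, the following minimal sketch (on synthetic compositions, with the seven ions of the case study used only as labels) shows the standard construction: compositions are centred-log-ratio (clr) transformed and the leading principal direction yields a log-contrast whose sample scores can serve as indicator values.

```python
# Minimal sketch of a simplicial (clr-based) principal component analysis:
# compositions are clr-transformed and the principal directions define
# log-contrasts that can serve as environmental indicators.
import numpy as np

def clr(X):
    """Centred log-ratio transform of a (n_samples, n_parts) composition."""
    logX = np.log(X)
    return logX - logX.mean(axis=1, keepdims=True)

rng = np.random.default_rng(1)
parts = ["Na", "K", "Ca", "Mg", "HCO3", "SO4", "Cl"]      # labels as in the case study
X = rng.dirichlet(np.ones(len(parts)), size=200)          # synthetic compositions

Z = clr(X)
Zc = Z - Z.mean(axis=0)
cov = np.cov(Zc, rowvar=False)
eigval, eigvec = np.linalg.eigh(cov)
first_contrast = eigvec[:, -1]                            # leading log-contrast (coefficients sum to ~0)
scores = Zc @ first_contrast                              # indicator value per sample

print(dict(zip(parts, np.round(first_contrast, 3))))
print(np.round(scores[:5], 3))                            # indicator values for the first five samples
```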

Relevance:

60.00%

Publisher:

Abstract:

Schizotypy is a multidimensional personality construct representing the extension of psychosis-like traits into the general population. Schizotypy has been associated with attenuated expressions of many of the same neuropsychological abnormalities as schizophrenia, including an atypical pattern of functional hemispheric asymmetry. Unfortunately, the previous literature on links between schizotypy and hemispheric asymmetry is inconsistent, with some research indicating that elevated schizotypy is associated with relative right-over-left hemisphere shifts, left-over-right hemisphere shifts, bilateral impairments, or with no hemispheric differences at all. This inconsistency may result from different methodologies, scales, and/or sex proportions between studies. In a within-participant design, we tested the four possible links between laterality and schizotypy by comparing the relationship between two common self-report measures of multidimensional schizotypy (the O-LIFE questionnaire, and two Chapman scales, magical ideation and physical anhedonia) and performance in two computerized lateralised hemifield paradigms (lexical decision, chimeric face processing) in 80 men and 79 women. Results for the two scales and two tasks did not unequivocally support any of the four possible links. We discuss the possibilities that a link between schizotypy and laterality (1) exists, but is subtle, probably fluctuating, and unable to be assessed by the traditional methodologies used here; (2) does not exist; or (3) is indirect, mediated by other factors (e.g. stress-responsiveness, handedness, drug use) whose influences need further exploration.

Relevance:

60.00%

Publisher:

Abstract:

This paper concerns the implementation of the QuEChERS method for the GC-FPD analysis of 53 different pesticides from the organophosphate class in whole UHT and pasteurized milk. Selectivity, linearity, repeatability, recovery and limits of detection and quantification were evaluated. Of all the pesticide recoveries, 51 were considered satisfactory, since the values ranged from 70 to 120% with RSD < 20%. The quantification limits ranged from 0.005 to 0.4 mg kg-1. The QuEChERS method was suitable for the determination of 52 pesticides, presenting several advantages - quick, cheap, easy, effective, rugged and safe - over other traditional methodologies.
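
The acceptance criteria quoted above (recoveries of 70-120% with RSD < 20%) follow directly from the standard definitions; the small helper below illustrates the arithmetic on invented replicate data.

```python
# Recovery (%) and relative standard deviation (RSD, %) from replicate
# analyses of a spiked sample (the values below are made up).
import statistics

def recovery_and_rsd(measured, spiked_level):
    mean = statistics.mean(measured)
    recovery = 100.0 * mean / spiked_level
    rsd = 100.0 * statistics.stdev(measured) / mean
    return recovery, rsd

replicates = [0.093, 0.101, 0.097, 0.088, 0.095]    # mg kg-1, hypothetical
rec, rsd = recovery_and_rsd(replicates, spiked_level=0.100)
print(f"recovery = {rec:.1f}%  RSD = {rsd:.1f}%")   # acceptable if 70-120% and RSD < 20%
```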

Relevance:

60.00%

Publisher:

Abstract:

Traditional reading, and introducing children to reading through traditional methodologies, is a losing battle. It is therefore important to build reading habits from children's own imaginaries and languages, so as not to repeat the same mistakes. This new methodology transforms the original text of stories by the classic writers Robert Louis Stevenson, Nikolás Gogol and Edgar Allan Poe into an enriching experience, relating it to various categories in order to build connections that increase knowledge and offer new ways of doing, acting and thinking about reading.

Relevance:

60.00%

Publisher:

Abstract:

A novel analytical approach, based on a miniaturized extraction technique, microextraction by packed sorbent (MEPS), followed by ultrahigh pressure liquid chromatography (UHPLC) separation combined with photodiode array (PDA) detection, has been developed and validated for the quantitative determination of sixteen biologically active phenolic constituents of wine. In addition to routine experiments establishing the validity of the assay against internationally accepted criteria (linearity, sensitivity, selectivity, precision, accuracy), the effects of the important experimental parameters on MEPS performance, such as the type of sorbent material (C2, C8, C18, SIL, and M1), the number of extraction cycles (extract-discard), elution volume, sample volume, and ethanol content, were studied. The optimal MEPS extraction conditions were obtained using the C8 sorbent and small sample volumes (250 μL) in five extraction cycles and in a short time period (about 5 min for the entire sample preparation step). The wine bioactive phenolics were eluted with 250 μL of a mixture containing 95% methanol and 5% water, and the separation was carried out on an HSS T3 analytical column (100 mm × 2.1 mm, 1.8 μm particle size) using a binary mobile phase composed of aqueous 0.1% formic acid (eluent A) and methanol (eluent B) in gradient elution mode (10 min of total analysis). The method gave satisfactory results in terms of linearity, with r2-values > 0.9986 within the established concentration range. The LOD varied from 85 ng mL−1 (ferulic acid) to 0.32 μg mL−1 ((+)-catechin), whereas the LOQ values ranged from 0.028 μg mL−1 (ferulic acid) to 1.08 μg mL−1 ((+)-catechin). Typical recoveries ranged between 81.1 and 99.6% for red wines and between 77.1 and 99.3% for white wines, with relative standard deviations (RSD) no larger than 10%. The extraction yields of the MEPSC8/UHPLC–PDA methodology were between 78.1% (syringic acid) and 99.6% (o-coumaric acid) for red wines and between 76.2 and 99.1% for white wines. The inter-day precision, expressed as the relative standard deviation (RSD%), varied between 0.2% (p-coumaric and o-coumaric acids) and 7.5% (gentisic acid), while the intra-day precision varied between 0.2% (o-coumaric and cinnamic acids) and 4.7% (gallic acid and (−)-epicatechin). On the basis of the analytical validation, the MEPSC8/UHPLC–PDA methodology proves to be an improved, reliable, and ultra-fast approach for the analysis of wine bioactive phenolics, owing to its capability of determining several bioactive metabolites simultaneously, in a single chromatographic run, with high sensitivity, selectivity and resolving power within only 10 min. Preliminary studies were carried out on 34 real whole wine samples in order to assess the performance of the described procedure. The new approach offers decreased sample preparation and analysis time, and moreover is cheaper, more environmentally friendly and easier to perform than traditional methodologies.
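
As a small, hedged illustration of the linearity criterion quoted above (r2-values > 0.9986), the snippet below fits a calibration line to an invented standard series and computes r2; the concentrations and peak areas are placeholders, not data from the study.

```python
# Linearity check for a calibration series: least-squares fit and r^2
# (all numbers invented for illustration).
import numpy as np

conc = np.array([0.1, 0.5, 1.0, 2.5, 5.0, 10.0])                 # ug mL-1, hypothetical standards
area = np.array([1.2e3, 6.1e3, 1.19e4, 3.02e4, 6.05e4, 1.21e5])  # corresponding peak areas

slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
ss_res = np.sum((area - pred) ** 2)
ss_tot = np.sum((area - area.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"calibration: area = {slope:.1f} * conc + {intercept:.1f}, r2 = {r2:.5f}")
```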

Relevance:

60.00%

Publisher:

Abstract:

Some authors have shown the need to understand the technological structuring process in contemporary firms. From this perspective, the software industry is a very important element, because it provides products and services directly to many organizations from many fields. In this case, the Brazilian software industry has some peculiarities that distinguish it from industries located in developed countries, which makes understanding it even more relevant. There is evidence that local firms adopt different strategies and structural configurations to enter a market naturally dominated by large multinational firms. Therefore, this study aims to understand not only the structural configurations assumed by domestic firms but also the dynamics and the process that lead to these different configurations. To do so, this PhD dissertation investigates the institutional environment, its entities and the isomorphic movements, by employing an exploratory, descriptive and explanatory multiple-case study. Eight software development companies from Recife's information technology cluster were visited. A form was also applied, and an interview was conducted with one of each firm's main professionals. Although the study is predominantly qualitative, part of the data was analyzed through charts and graphs, providing an overview of the companies and their environment that was very useful for the analysis carried out through the interpretation of the interviews. As a result, it was found that companies are structured around hybrid business models derived from two ideal types of software development company: the software factory and the technology-based company. Regarding the development process, there is a balanced distribution between the traditional and the agile development paradigms. Among the traditional methodologies, the Rational Unified Process (RUP) is predominant; Scrum is the most widely used among the methodologies based on the Agile Manifesto's principles. Regarding the structuring process, each institutional entity acts in a way that generates different isomorphic pressures. Emphasis was given to entities such as customers, research agencies, clusters, market-leading businesses, public universities, incubators, software industry organizations, technology vendors, development tool suppliers and the managers' schooling and background, because they relate closely to the software firms. Regarding this relationship, a dual and bilateral influence was found. Finally, the structuring level of the organizational field was also identified as low, which gives organizational actors room to act independently.

Relevance:

60.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

60.00%

Publisher:

Abstract:

This article proposes a method for 3D road extraction from a stereopair of aerial images. The dynamic programming (DP) algorithm is used to carry out the optimization process in object space, instead of in image space as in traditional DP methodologies. This means that road centerlines are traced directly in object space, which implies that a mathematical relationship is needed to connect road points in object and image space. This allows radiometric information from the images to be integrated into the associated mathematical road model. As the approach depends on an initial approximation of each road, a few seed points are necessary to coarsely describe the road. The proposed method usually yields good results, but large anomalies along the road can disturb its performance. The method can therefore be used in practical applications, although some local manual editing of the extracted road centerline is to be expected.
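
To give a flavour of the DP optimization step (in a generic, grid-based setting, not the authors' object-space formulation), the sketch below traces a minimum-cost path through a synthetic cost grid column by column, allowing the path to move at most one row per step.

```python
# Generic dynamic-programming illustration (not the authors' object-space
# formulation): minimum-cost path through a cost grid, column by column,
# in the spirit of DP-based centreline tracing.
import numpy as np

def dp_min_path(cost):
    """cost: (rows, cols) array; returns the row index chosen per column."""
    rows, cols = cost.shape
    acc = cost.copy()
    back = np.zeros((rows, cols), dtype=int)
    for j in range(1, cols):
        for i in range(rows):
            lo, hi = max(0, i - 1), min(rows, i + 2)        # move at most one row per step
            k = lo + int(np.argmin(acc[lo:hi, j - 1]))
            acc[i, j] = cost[i, j] + acc[k, j - 1]
            back[i, j] = k
    path = [int(np.argmin(acc[:, -1]))]                     # best end row
    for j in range(cols - 1, 0, -1):                        # backtrack to the first column
        path.append(int(back[path[-1], j]))
    return path[::-1]

rng = np.random.default_rng(2)
print(dp_min_path(rng.random((5, 8))))
```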