821 results for techniques to develop formalisms
Abstract:
Advances in hardware and software technology enable us to collect, store and distribute large quantities of data on a very large scale. Automatically discovering and extracting hidden knowledge in the form of patterns from these large data volumes is known as data mining. Data mining technology is not only a part of business intelligence, but is also used in many other application areas such as research, marketing and financial analytics. For example, medical scientists can use patterns extracted from historic patient data to determine whether a new patient is likely to respond positively to a particular treatment; marketing analysts can use patterns extracted from customer data for future advertisement campaigns; finance experts have an interest in patterns that forecast the development of certain stock market shares for investment recommendations. However, extracting knowledge in the form of patterns from massive data volumes imposes a number of computational challenges in terms of processing time, memory, bandwidth and power consumption. These challenges have led to the development of parallel and distributed data analysis approaches and the utilisation of Grid and Cloud computing. This chapter gives an overview of parallel and distributed computing approaches and how they can be used to scale up data mining to large datasets.
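As a minimal illustration of the data-parallel style of mining the chapter surveys, the following Python sketch splits a toy transaction dataset across worker processes, counts co-occurring item pairs locally, and merges the partial counts. The dataset, the support threshold and the two-worker layout are assumptions made purely for illustration, not material from the chapter.

from multiprocessing import Pool
from collections import Counter
from itertools import combinations

def count_pairs(transactions):
    """Count co-occurring item pairs in one partition of the data."""
    counts = Counter()
    for items in transactions:
        for pair in combinations(sorted(set(items)), 2):
            counts[pair] += 1
    return counts

if __name__ == "__main__":
    data = [
        ["bread", "milk"],
        ["bread", "beer"],
        ["milk", "beer", "bread"],
        ["milk", "diapers"],
    ]
    partitions = [data[0::2], data[1::2]]            # split the data across workers
    with Pool(processes=2) as pool:
        partial = pool.map(count_pairs, partitions)  # count locally, in parallel
    total = sum(partial, Counter())                  # merge the partial counts
    frequent = {pair: c for pair, c in total.items() if c >= 2}
    print(frequent)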
Abstract:
We have optimised the atmospheric radiation algorithm of the FAMOUS climate model on several hardware platforms. The optimisation involved translating the Fortran code to C and restructuring the algorithm around the computation of a single air column. Instead of the existing MPI-based domain decomposition, we used a task queue and a thread pool to schedule the computation of individual columns on the available processors. Finally, four air columns are packed together in a single data structure and computed simultaneously using Single Instruction Multiple Data operations. The modified algorithm runs more than 50 times faster on the CELL’s Synergistic Processing Elements than on its main PowerPC processing element. On Intel-compatible processors, the new radiation code runs 4 times faster. On the tested graphics processor, using OpenCL, we find a speed-up of more than 2.5 times as compared to the original code on the main CPU. Because the radiation code takes more than 60% of the total CPU time, FAMOUS executes more than twice as fast. Our version of the algorithm returns bit-wise identical results, which demonstrates the robustness of our approach. We estimate that this project required around two and a half man-years of work.
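The column-oriented restructuring lends itself to a simple illustration: a pool of worker threads drains a queue of air-column tasks, with columns handled in packs of four so that each pack can be updated with vectorised operations. The sketch below is a toy Python/NumPy analogue of that scheduling pattern, not the FAMOUS radiation code; radiative_update() and all array sizes are placeholders.

import numpy as np
from concurrent.futures import ThreadPoolExecutor

N_COLUMNS, N_LEVELS, PACK = 128, 20, 4

def radiative_update(pack):
    """Toy per-level update applied to four columns at once (shape: PACK x N_LEVELS)."""
    return pack * 0.99 + 0.01 * np.roll(pack, 1, axis=1)

columns = np.random.rand(N_COLUMNS, N_LEVELS)
packs = [columns[i:i + PACK] for i in range(0, N_COLUMNS, PACK)]   # the task queue

with ThreadPoolExecutor(max_workers=4) as pool:                    # the thread pool
    results = list(pool.map(radiative_update, packs))              # workers drain the queue

columns_out = np.vstack(results)
print(columns_out.shape)   # (128, 20)

Packing four columns into one contiguous data structure, as in the paper, means each vector instruction advances four columns at once, which is where the SIMD speed-up comes from.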
Abstract:
Older adult computer users often lose track of the mouse cursor and so resort to methods such as shaking the mouse or searching the entire screen to find the cursor again. Hence, this paper describes how a standard optical mouse was modified to include a touch sensor, activated by releasing and touching the mouse, which automatically centers the mouse cursor to the screen, potentially making it easier to find a ‘lost’ cursor. Six older adult computer users and six younger computer users were asked to compare the touch sensitive mouse with cursor centering with two alternative techniques for locating the mouse cursor: manually shaking the mouse and using the Windows sonar facility. The time taken to click on a target after a distractor task was recorded, and results show that centering the mouse was the fastest to use, with a 35% improvement over shaking the mouse. Five out of six older participants ranked the touch sensitive mouse with cursor centering as the easiest to use.
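For readers curious how the centring step itself can be realised in software, the following sketch centres the cursor when a (here simulated) touch event arrives. It assumes a Windows machine and uses the user32 calls GetSystemMetrics and SetCursorPos; the simulated trigger stands in for the hardware touch sensor described in the paper.

import ctypes

user32 = ctypes.windll.user32   # Windows-only

def centre_cursor():
    """Move the mouse cursor to the centre of the primary screen."""
    width = user32.GetSystemMetrics(0)    # SM_CXSCREEN
    height = user32.GetSystemMetrics(1)   # SM_CYSCREEN
    user32.SetCursorPos(width // 2, height // 2)

if __name__ == "__main__":
    input("Press Enter to simulate touching the mouse sensor...")
    centre_cursor()   # the cursor jumps to the screen centre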
Abstract:
Time series of global and regional mean Surface Air Temperature (SAT) anomalies are a common metric used to estimate recent climate change. Various techniques can be used to create these time series from meteorological station data. The degree of difference arising from using five different techniques, based on existing temperature anomaly dataset techniques, to estimate Arctic SAT anomalies over land and sea ice was investigated using reanalysis data as a testbed. Techniques which interpolated anomalies were found to result in smaller errors than non-interpolating techniques relative to the reanalysis reference. Kriging techniques provided the smallest errors in estimates of Arctic anomalies, and Simple Kriging was often the best kriging method in this study, especially over sea ice. A linear interpolation technique had, on average, Root Mean Square Errors (RMSEs) up to 0.55 K larger than the two kriging techniques tested. Non-interpolating techniques provided the least representative anomaly estimates. Nonetheless, they serve as useful checks for confirming whether estimates from interpolating techniques are reasonable. The interaction of meteorological station coverage with estimation techniques between 1850 and 2011 was simulated using an ensemble dataset comprising repeated individual years (1979-2011). All techniques were found to have larger RMSEs for earlier station coverages. This supports calls for increased data sharing and data rescue, especially in sparsely observed regions such as the Arctic.
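As an illustration of the best-performing family of techniques, the sketch below implements Simple Kriging of a temperature anomaly at an unobserved location from a handful of stations. The exponential covariance model, its parameters and the toy station data are assumptions for illustration only and do not reproduce the study's configuration.

import numpy as np

def simple_kriging(stations, anoms, target, mean=0.0, sill=1.0, rng=1500.0):
    """Estimate a temperature anomaly at `target` from station anomalies.
    stations: (n, 2) coordinates in km; anoms: (n,) anomalies in K."""
    def cov(d):
        return sill * np.exp(-d / rng)                      # exponential covariance model
    d_ij = np.linalg.norm(stations[:, None] - stations[None, :], axis=-1)
    d_i0 = np.linalg.norm(stations - target, axis=-1)
    weights = np.linalg.solve(cov(d_ij), cov(d_i0))          # simple kriging weights
    return mean + weights @ (anoms - mean)

stations = np.array([[0.0, 0.0], [800.0, 100.0], [300.0, 900.0]])
anoms = np.array([1.2, 0.4, 0.9])
print(simple_kriging(stations, anoms, np.array([400.0, 400.0])))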
Abstract:
The flexibility of information systems (IS) has been studied as a way to improve adaptation in support of business agility, understood as the set of capabilities needed to compete more effectively and adapt to rapid changes in market conditions (Glossary of business agility terms, 2003). However, most work on IS flexibility has been limited to systems architecture, ignoring the analysis of interoperability as a part of flexibility from the requirements stage. This paper reports a PhD project that proposes an approach to developing IS with flexibility features, considering challenges of flexibility in small and medium enterprises (SMEs) such as the lack of interoperability and the agility of their business. The research is motivated by the high prices of IS in developing countries and by the usefulness of organizational semiotics in supporting the analysis of requirements for IS (Liu, 2005).
Abstract:
The failing heart is characterized by complex tissue remodelling involving increased cardiomyocyte death and impairment of sarcomere function, metabolic activity, and endothelial and vascular function, together with increased inflammation and interstitial fibrosis. For years, therapeutic approaches for heart failure (HF) relied on vasodilators and diuretics, which relieve cardiac workload and HF symptoms. The introduction into the clinic of drugs interfering with beta-adrenergic and angiotensin signalling has improved survival by interfering with the intimate mechanism of cardiac compensation. Current therapy, though, still has a limited capacity to restore muscle function fully, and the development of novel therapeutic targets remains an important medical need. Recent progress in understanding the molecular basis of myocardial dysfunction in HF is paving the way for the development of new treatments capable of restoring muscle function and targeting specific pathological subsets of left ventricular (LV) dysfunction. These include potentiating cardiomyocyte contractility, increasing cardiomyocyte survival and adaptive hypertrophy, increasing oxygen and nutrient supply by sustaining vessel formation, and reducing ventricular stiffness by favourable extracellular matrix remodelling. Here, we consider drugs such as omecamtiv mecarbil, nitroxyl donors, cyclosporin A, SERCA2a (sarcoplasmic/endoplasmic reticulum Ca(2+) ATPase 2a), neuregulin, and bromocriptine, all of which are currently in clinical trials as potential HF therapies, and discuss novel molecular targets with potential therapeutic impact that are in the pre-clinical phases of investigation. Finally, we consider conceptual changes in basic science approaches to improve their translation into successful clinical applications.
Abstract:
Photodynamic therapy, used mainly for cancer treatment and the inactivation of microorganisms, is based on the production of reactive oxygen species by light irradiation of a sensitizer. Hematoporphyrin derivatives such as Photofrin (R) (PF), Photogem (R) (PG) and Photosan (R) (PS), and chlorin-e6 derivatives such as Photodithazine (R) (PZ), have suitable sensitizing properties. The present study provides a way to perform a rapid preliminary evaluation of photosensitizer efficacy by combining three techniques: a) the use of bovine serum albumin and uric acid as chemical dosimeters; b) photo-hemolysis of red blood cells, used as a model of cell membrane interaction; and c) octanol/phosphate buffer partition to assess the relative lipophilicity of the compounds. The results suggest the photodynamic efficiency ranking PZ > PG >= PF > PS. These results agree with the cytotoxicity of the photosensitizers as well as with the chromatographic separation of the HpDs, both performed in our group, showing that the more lipophilic the dye, the more acute the damage to the RBC membrane and the oxidation of indole, which is immersed in the hydrophobic region of albumin.
Abstract:
An e-guide is a phenomenon that can add value for the visitor. It implies a transition from a human guide to a digital one. E-guides have previously been created that lacked the technical qualities required to convey information to the visitor in the way the developers and other stakeholders intended. We have examined and evaluated one of these e-guides in order to create a prototype for an e-guide covering Elsborg, which is part of the World Heritage Site of Falun (Världsarvet Falun). The prototype targets smartphones and the Android operating system. Based on earlier and current e-guides and on the prototype, we analysed the results and thereby investigated the possibility of creating a generic architecture, in the form of an e-guide application that can be applied in different contexts independently of its content. Our research has shown, however, that it is difficult to create a template general enough to be implemented directly without some adjustments.
Abstract:
It is nowadays increasingly common in the academic world to use new kinds of “learning tools”. One of these is the “reflection protocol”, which usually consists of a few pages of freely written text related to something the students have read. Opinions about the value of this method seem to vary widely: some teachers and students are enthusiastic, while others are rather critical. Writing a “reflection protocol” is not primarily about producing a summary or a review, nor even about analysing a text. Instead, it is about writing down the thoughts and questions that arise during the reading. It is also about making associations, reflecting, interpreting the text and relating it to a theme of some kind. The purpose of using “reflection protocols” is, as we see it, mainly for students to practise independent thinking from a scientific point of view, but it also offers the possibility of a better understanding of another person’s thinking. This seems to open up a fruitful dialogue and a way to learn. In this paper we discuss whether that is the case.
Abstract:
BACKGROUND: With a pressing need to identify potential means to improve quality of care, national quality registries (NQRs) have been identified as a promising route. Yet, there is limited evidence regarding what hinders and facilitates the NQR innovation and what characterises the contexts in which NQRs are applied and drive quality improvement. Presumably, barriers and facilitators to NQR-driven quality improvement may be found in the healthcare context, in the politico-administrative context, and within an NQR itself. In this study, we investigated the potential variation in whether and how an NQR was applied by decision-makers and users in regions and clinical settings. The aim was to depict the interplay between the clinical and the politico-administrative tiers in the use of NQRs to develop quality of care, examining an established registry on stroke care as a case study. METHODS: We interviewed 44 individuals representing the clinical and the politico-administrative settings of 4 out of 21 regions, strategically chosen to include stroke units representing a variety of outcomes in the NQR on stroke (Riksstroke) and a variety of settings. The transcribed interviews were analysed by applying the Consolidated Framework for Implementation Research (CFIR). RESULTS: In two regions, decision-makers and/or administrators had initiated healthcare process projects for stroke, engaging the health professionals in the local stroke units, who contributed with, for example, local data from Riksstroke. The Riksstroke data was used for identifying improvement issues, for setting goals, and for ascertaining that the stroke units achieved an equivalent standard of care and a certain level of quality of stroke care. Meanwhile, one region had only recently initiated such a project, and the fourth region had no similar collaboration across tiers. Apart from these projects, there was limited joint communication across tiers and none that included all individuals and functions engaged in quality improvement with regard to stroke care. CONCLUSIONS: If NQRs are to provide quality improvement and learning opportunities, advances must be made in the links between the structures and processes across all organisational tiers, including decision-makers, administrators and health professionals engaged in a particular healthcare process.
Abstract:
Countless terrible and alarming problems remain poorly resolved, despite the mobilisation of NGOs to alleviate them. For a long time, the private sector turned its back on concerns such as these, until a new type of revolutionary entrepreneur appeared with a new concept for fighting poverty. Mohamed Yunus pioneered social entrepreneurship when he created the Grameen Bank 36 years ago: he challenged strict conventional rules by lending money to Bengalis deemed unworthy of credit, while making a profit at the same time. Today, social entrepreneurship is a widespread phenomenon, but most entrepreneurs in the sector still face difficulties. Academic research on for-profit social entrepreneurship is still tentative. The present work is a modest attempt to analyse the challenges that a for-profit social entrepreneur will face along the way to creating a venture and sustaining its objectives. The literature review shows that the difficulties faced by entrepreneurs stem from several factors, including issues directly related to market uncertainty and the local context, organisational issues, funding, ethics, and issues related to the resilience of the business model. The propositions derived from the literature review were confronted with concrete cases through interviews with social entrepreneurs, impact investors and support institutions. The research results corroborate the initial propositions but emphasise the need to address, with careful consideration, the issues related to market uncertainty and to the design of adequate governance. With respect to market uncertainty, identifying the stakeholders of the social venture and adopting an effective mindset for adjusting initial assumptions to the local reality are a key pattern of success for the social venture. At the organisational level, building an expert and committed team, together with the design of a governance structure that balances the desire for profit with the need for financial sustainability, is a guarantee of success for the social entrepreneur.
Abstract:
In the present research, headspace solid-phase microextraction (HS-SPME) followed by gas chromatography–mass spectrometry (GC–qMS) was evaluated as a reliable and improved alternative to the commonly used liquid–liquid extraction (LLE) technique for establishing the pattern of hydrolytically released components of 7 Vitis vinifera L. grape varieties commonly used to produce the world-famous Madeira wine. Since there are no data available on their glycosidic fractions, as a first step two hydrolysis procedures, acid and enzymatic, were carried out using Boal grapes as the matrix. Several parameters likely to influence the hydrolytic process were studied. The best results, expressed as GC peak area, number of identified components and reproducibility, were obtained using ProZym M with β-glucosidase activity at 35 °C for 42 h. For the extraction of hydrolytically released components, the HS-SPME technique was evaluated as a reliable and improved alternative to the conventional extraction technique, LLE (ethyl acetate). HS-SPME using a DVB/CAR/PDMS coating fiber displayed an extraction capacity twofold higher than LLE (ethyl acetate). The hydrolyzed fraction was mainly characterized by the occurrence of aliphatic and aromatic alcohols, followed by acids, esters, carbonyl compounds, terpenoids, and volatile phenols. Concerning terpenoids, their contribution to the total hydrolyzed fraction is highest for Malvasia Cândida (23%) and Malvasia Roxa (13%), and, according to previous studies, their presence, even at low concentrations, is important from a sensory point of view (they can impart floral notes to the wines) owing to their low odor thresholds (μg/L). According to the data obtained by principal component analysis (PCA), the sensory properties of Madeira wines produced from Malvasia Cândida and Malvasia Roxa could be improved by the hydrolysis procedure, since their hydrolyzed fraction is mainly characterized by terpenoids (e.g. linalool, geraniol), which are responsible for floral notes. Bual and Sercial grapes are characterized by aromatic alcohols (e.g. benzyl alcohol, 2-phenylethyl alcohol), so an improvement in the sensory characteristics (citrus, sweet and floral odors) of the corresponding wines, as a result of the hydrolytic process, is expected.
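For illustration only, the sketch below shows the kind of PCA projection described above, applied to an invented matrix of GC peak areas per grape variety; the compound classes and numbers are placeholders, not data from the study.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

varieties = ["Boal", "Sercial", "Malvasia Candida", "Malvasia Roxa"]
# columns: aromatic alcohols, terpenoids, acids, esters (arbitrary peak-area units)
peak_areas = np.array([
    [9.1, 1.0, 2.3, 1.1],
    [8.4, 0.8, 2.0, 1.4],
    [3.2, 6.5, 1.8, 0.9],
    [3.9, 4.1, 2.1, 1.0],
])

# standardise the peak areas, then project the varieties onto two principal components
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(peak_areas))
for name, (pc1, pc2) in zip(varieties, scores):
    print(f"{name:18s} PC1={pc1:+.2f} PC2={pc2:+.2f}")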
Abstract:
In this paper we present and demonstrate a technique that allows simultaneous and independent measurement of small changes in the refractive index and the absorption coefficient produced in photosensitive materials during holographic exposure. The technique is based on the synchronous detection of two-wave mixing signals in both directions of the transmitted interfering beams. By processing both signals it is possible to separate the diffraction contributions of the refractive index from the absorption coefficient and simultaneously stabilize the incident fringe pattern. The demonstration of this technique is undertaken by following the temporal evolution of the phase and amplitude modulations in photoresist films. To check the ability of the technique to perform numeric evaluations, for a positive photoresist the changes in the optical constants were measured and compared with those obtained using independent methods.
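A highly simplified numerical sketch of the separation idea follows. It assumes, purely for illustration, that the refractive-index (phase-grating) contribution enters the two synchronously detected transmitted-beam signals with opposite signs while the absorption contribution enters both with the same sign, so the two can be recovered by inverting a 2x2 linear system; the coupling constants and signal model are invented placeholders, not the model used in the paper.

import numpy as np

# Toy model: S1 = +k_n*dn + k_a*da, S2 = -k_n*dn + k_a*da (illustrative only)
k_n, k_a = 4.0e3, 2.0e2             # placeholder coupling constants
dn_true, da_true = 2.0e-4, 5.0e-3   # "unknown" index and absorption changes

S1 = +k_n * dn_true + k_a * da_true   # synchronously detected signal along beam 1
S2 = -k_n * dn_true + k_a * da_true   # synchronously detected signal along beam 2

A = np.array([[+k_n, k_a],
              [-k_n, k_a]])
dn_est, da_est = np.linalg.solve(A, np.array([S1, S2]))
print(dn_est, da_est)   # recovers dn_true and da_true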
Abstract:
This paper presents recent developments in heterodyne-detection holographic techniques for studying photosensitive materials. The current state of the technique allows simultaneous and independent measurement of the refractive-index and absorption-coefficient changes in photosensitive materials and their use to self-stabilize the fringe pattern. Modelling the measured signal, together with fringe stabilization, allows long-term fitting of the optical properties and the study of photosensitive materials close to saturation.