958 results for Curvelet transform
Abstract:
The Mathematica system (version 4.0) is employed in the solution of nonlinear diffusion and convection-diffusion problems, formulated as transient one-dimensional partial differential equations with potential-dependent equation coefficients. The Generalized Integral Transform Technique (GITT) is first implemented for the hybrid numerical-analytical solution of such classes of problems, through the symbolic integral transformation and elimination of the space variable, followed by the use of the built-in Mathematica function NDSolve for handling the resulting transformed ODE system. This approach offers an error-controlled final numerical solution, through the simultaneous control of local errors in this reliable ODE solver and of the proposed eigenfunction expansion truncation order. For co-validation purposes, the same built-in function NDSolve is employed in the direct solution of these partial differential equations, as made possible by the algorithms implemented in Mathematica (versions 3.0 and up), based on the method of lines. Various numerical experiments are performed and the relative merits of each approach are critically pointed out.
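As a rough illustration of the direct method-of-lines approach mentioned in this abstract, the Python sketch below discretises a hypothetical nonlinear diffusion equation in space and hands the resulting ODE system to a stiff integrator; the equation, the coefficient form D(u) = 1 + u², the grid, and the boundary conditions are invented for the example and are not the paper's GITT formulation or its Mathematica code.

```python
# Minimal method-of-lines sketch for a nonlinear diffusion equation
#   u_t = d/dx( D(u) * u_x ),  D(u) = 1 + u**2   (hypothetical coefficient)
# on x in [0, 1] with u(0, t) = 1, u(1, t) = 0.  This is only a conceptual
# analogue of the direct NDSolve route, not the paper's GITT solution.
import numpy as np
from scipy.integrate import solve_ivp

N = 101                      # number of grid points
x = np.linspace(0.0, 1.0, N)
dx = x[1] - x[0]

def D(u):                    # potential-dependent diffusion coefficient (assumed form)
    return 1.0 + u**2

def rhs(t, u):
    du = np.zeros_like(u)
    # flux at interior cell faces: F_{i+1/2} = D(u_{i+1/2}) * (u_{i+1} - u_i) / dx
    Dm = D(0.5 * (u[:-1] + u[1:]))
    F = Dm * (u[1:] - u[:-1]) / dx
    du[1:-1] = (F[1:] - F[:-1]) / dx   # divergence of the flux
    du[0] = du[-1] = 0.0               # Dirichlet boundaries held fixed
    return du

u0 = np.where(x < 0.5, 1.0, 0.0)       # hypothetical initial condition
u0[0], u0[-1] = 1.0, 0.0
sol = solve_ivp(rhs, (0.0, 0.1), u0, method="BDF", rtol=1e-6, atol=1e-8)
print(sol.y[:, -1][::20])              # coarse sample of the final profile
```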
Abstract:
In this paper we present an algorithm for the numerical simulation of cavitation in the hydrodynamic lubrication of journal bearings. Although this physical process is usually modelled as a free boundary problem, we adopt the equivalent variational inequality formulation. We propose a two-level iterative algorithm, where the outer iteration is associated with the penalty method, used to transform the variational inequality into a variational equation, and the inner iteration is associated with the conjugate gradient method, used to solve the linear system generated by applying the finite element method to the variational equation. The inner part was implemented using the element-by-element strategy, which is easily parallelized. We analyse the behaviour of two physical parameters and discuss some numerical results. We also analyse some results related to the performance of a parallel implementation of the algorithm.
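A toy one-dimensional analogue of the two-level scheme described above is sketched below: a discretised obstacle problem stands in for the lubrication model, the outer loop tightens a penalty parameter, and the inner solves use SciPy's conjugate gradient routine instead of the paper's element-by-element parallel FEM solver; the mesh, load, and tolerances are invented for illustration.

```python
# Toy two-level iteration for a variational inequality (obstacle problem u >= 0):
#   outer loop: penalty method, adding (1/eps) * min(u, 0) to enforce u >= 0
#   inner loop: conjugate gradient on the linearised (active-set) system
# This is a 1-D finite-difference stand-in, not the journal-bearing FEM code.
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg

n, L = 200, 1.0
h = L / (n + 1)
A = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n)) / h**2   # -u'' stiffness matrix
x = np.linspace(h, L - h, n)
f = -10.0 * np.sin(2 * np.pi * x)        # hypothetical load; its negative part creates contact

u = np.zeros(n)
for eps in [1e-1, 1e-2, 1e-3, 1e-4]:     # outer iteration: tighten the penalty
    for _ in range(30):                  # fixed-point pass on the active set
        active = (u < 0).astype(float)   # where the constraint u >= 0 is violated
        M = A + (1.0 / eps) * diags(active)
        u_new, info = cg(M, f, x0=u)     # inner iteration: conjugate gradient solve
        if np.linalg.norm(u_new - u) < 1e-10:
            u = u_new
            break
        u = u_new

print("min(u) after penalisation:", u.min())   # should be only slightly negative
```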
Abstract:
This thesis discusses the opportunities and challenges of cloud computing technology in healthcare information systems by reviewing the existing literature on cloud computing and healthcare information systems and the impact of cloud computing technology on the healthcare industry. The review shows that if problems related to data security are solved, cloud computing will positively transform healthcare institutions by benefiting the healthcare IT infrastructure and by improving healthcare services. This thesis therefore explores the opportunities and challenges associated with cloud computing in the context of Finland, in order to help healthcare organizations and stakeholders determine their direction when deciding to adopt cloud technology in their information systems.
Abstract:
Because of the increased availability of different kinds of business intelligence technologies and tools, it is easy to fall into the illusion that new technologies will automatically solve a company's data management and reporting problems. Management is not only about managing technology but also about managing processes and people. This thesis focuses on traditional data management and on the performance management of production processes, both of which can be seen as requirements for long-lasting development. Some operative BI solutions are also considered as part of the ideal reporting system. The objectives of this study are to examine what requirements effective performance management of production processes places on a company's data management and reporting, and to see how these affect its efficiency. The research is executed as a theoretical literature review of the subjects and as a qualitative case study of a reporting development project at Finnsugar Ltd. The case study is examined through theoretical frameworks and through active participant observation. To get a better picture of the ideal reporting system, simple investment calculations are performed. According to the results of the research, the requirements for effective performance management of production processes are automation of data collection, integration of operative databases, use of efficient data management technologies such as ETL (Extract, Transform, Load) processes, a data warehouse (DW) and Online Analytical Processing (OLAP), and efficient management of processes, data and roles.
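As a minimal illustration of the ETL pattern mentioned above, the Python sketch below extracts rows from a hypothetical CSV export of an operative system, transforms them, and loads them into a small warehouse table; the file name, column names, and SQLite target are invented for the example and do not describe the case company's actual systems.

```python
# Minimal ETL sketch: extract production records from a CSV export,
# transform them (typing, cleaning, tolerant parsing), load into a warehouse table.
# File name, column names, and the SQLite target are hypothetical.
import csv
import sqlite3
from datetime import datetime

def extract(path):
    with open(path, newline="", encoding="utf-8") as f:
        yield from csv.DictReader(f)

def transform(rows):
    for r in rows:
        try:
            yield {
                "batch_date": datetime.strptime(r["date"], "%d.%m.%Y").date().isoformat(),
                "line": r["line"].strip().upper(),
                "output_kg": float(r["output_kg"].replace(",", ".")),
                "waste_kg": float(r["waste_kg"].replace(",", ".")),
            }
        except (KeyError, ValueError):
            continue                      # skip malformed operative records

def load(rows, db="warehouse.db"):
    con = sqlite3.connect(db)
    con.execute("""CREATE TABLE IF NOT EXISTS fact_production
                   (batch_date TEXT, line TEXT, output_kg REAL, waste_kg REAL)""")
    con.executemany(
        "INSERT INTO fact_production VALUES (:batch_date, :line, :output_kg, :waste_kg)",
        rows)
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(extract("production_export.csv")))   # hypothetical export file
```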
Abstract:
In this study, cantilever-enhanced photoacoustic spectroscopy (CEPAS) was applied in different drug detection schemes. The study was divided into two different applications: trace detection of vaporized drugs and drug precursors in the gas phase, and detection of cocaine abuse in hair. The main focus, however, was the study of hair samples. In the gas phase, methyl benzoate, a hydrolysis product of cocaine hydrochloride, and benzyl methyl ketone (BMK), a precursor of amphetamine and methamphetamine, were investigated. In the solid phase, hair samples from cocaine overdose patients were measured and compared to a drug-free reference group. As hair consists mostly of long fibrous proteins generally called keratin, proteins from fingernails and saliva were also studied for comparison. Different measurement setups were applied in this study. Gas measurements were carried out using quantum cascade lasers (QCL) as the source in the photoacoustic detection. An external cavity (EC) design was also used for a broader tuning range. Detection limits of 3.4 parts per billion (ppb) for methyl benzoate and 26 ppb for BMK in 0.9 s were achieved with the EC-QCL PAS setup. The achieved detection limits are sufficient for realistic drug detection applications. The measurements from drug overdose patients were carried out using Fourier transform infrared (FTIR) PAS. The drug-containing hair samples and drug-free samples were both measured with the FTIR-PAS setup, and the measured spectra were analyzed statistically with principal component analysis (PCA). The two groups were separated by their spectra with PCA and proper spectral pre-processing. To improve the method, EC-QCL measurements of the hair samples and studies using photoacoustic microsampling techniques were performed. High-quality, high-resolution spectra with a broad tuning range were recorded from a single hair fiber. This broad tuning range of an EC-QCL has not previously been used in the photoacoustic spectroscopy of solids. However, no drug detection studies were performed with the EC-QCL solid-phase setup.
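A hedged sketch of the statistical step described above is given below: standard normal variate (SNV) pre-processing followed by PCA, applied to a simulated stand-in for the FTIR-PAS hair spectra. The simulated data, group labels, and choice of two components are illustrative only, not the study's actual pipeline.

```python
# Sketch of spectral pre-processing + PCA for separating two sample groups.
# X is a hypothetical (n_samples x n_wavenumbers) matrix of FTIR-PAS spectra;
# here it is simulated, whereas the study used measured hair spectra.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
wavenumbers = np.linspace(750, 4500, 900)
baseline = np.exp(-((wavenumbers - 2900) / 600) ** 2)          # shared protein-like band
group = np.repeat([0, 1], 20)                                  # 0 = drug-free, 1 = positive
X = (baseline
     + 0.05 * group[:, None] * np.exp(-((wavenumbers - 1700) / 40) ** 2)  # small group difference
     + 0.02 * rng.standard_normal((40, wavenumbers.size)))     # measurement noise

# Standard normal variate (SNV): centre and scale each spectrum individually
X_snv = (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)

scores = PCA(n_components=2).fit_transform(X_snv)
for g in (0, 1):
    print(f"group {g}: mean PC1 = {scores[group == g, 0].mean():+.3f}, "
          f"mean PC2 = {scores[group == g, 1].mean():+.3f}")
```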
Abstract:
Global warming is arguably the greatest environmental challenge facing humanity in the 21st century. It is primarily caused by anthropogenic greenhouse gases (GHG) that trap heat in the atmosphere. Consequently, GHG emission mitigation is a critical issue on the political agenda of all major nations. India, like other developing countries, is facing this threat of climate change while dealing with the challenge of sustaining its rapid economic growth. India's economy is closely connected to its natural resource base and to climate-sensitive sectors such as water, agriculture and forestry. Due to climate change, the quality and distribution of India's natural resources may change and adversely affect the livelihoods of its people. India is therefore expected to face a major threat from the projected climate change. This study proposes possible solutions for GHG emission mitigation that are specific to the power sector of India. The methods discussed here would take the Indian power sector from its present coal-dominated model to a system centered on renewable energy sources. The study further proposes a future scenario for 2050, based on present Indian government policies and advances in global energy technologies.
Abstract:
Phytoremediation is a technique that aims at the decontamination of soil and water using plants as the decontamination agent. It is an alternative to the conventional methods of pumping and treating the water or physically removing the contaminated soil layer, and it is advantageous mainly because it has potential for in situ treatment and is economically viable. Moreover, after extracting the contaminant from the soil, the plant stores it for subsequent treatment when necessary, or even metabolizes it, in some cases transforming it into less toxic or even harmless products. Phytoremediation can be applied to soils contaminated by inorganic and/or organic substances. Promising phytoremediation results have already been obtained for heavy metals, petroleum hydrocarbons, pesticides, explosives, chlorinated solvents and toxic industrial by-products. Phytoremediation of herbicides shows good results for atrazine, the species Kochia scoparia having shown rhizospheric potential to phytostimulate the degradation of this molecule. Although still incipient in Brazil, studies already exist on some cultivated agricultural species and on wild or native species from the contaminated area itself, with the objective of selecting species that are efficient in the phytoremediation of the soil.
Abstract:
An interferometer for a low-resolution portable Fourier transform mid-infrared spectrometer was developed and studied experimentally. The final aim was a concept for a commercial prototype. Because of the portability requirement, the interferometer should be compact and insensitive to external temperature variations and mechanical vibrations. To minimise the size and manufacturing costs, a Michelson interferometer based on plane mirrors and a porch-swing bearing was selected, and no dynamic alignment system was applied. The driving motor was a linear voice coil actuator, chosen to avoid mechanical contact of the moving parts. The driving capability at the low mirror velocities required by photoacoustic detectors was studied. In total, four versions of such an interferometer were built and experimentally studied. The thermal stability during external temperature variations and the alignment stability over the mirror travel were measured using the modulation depth of a wide-diameter laser beam. A method for estimating the mirror tilt angle from the modulation depth was developed to take into account the effect of the non-uniform intensity distribution of the laser beam. The spectrometer stability was finally studied also using infrared radiation. The latest interferometer was assembled into a mid-infrared spectrometer with a spectral range from 750 cm−1 to 4500 cm−1. The interferometer size was (197 × 95 × 79) mm3 with a beam diameter of 25 mm. The alignment stability, expressed as the change of the tilt angle over the mirror travel of 3 mm, was 5 μrad, which decreases the modulation depth by only about 0.7 percent in the infrared at 3000 cm−1. During a temperature rise, the modulation depth at 3000 cm−1 changed by about 1–2 percentage units per degree Celsius over the short term, and by less than 0.2 percentage units per degree Celsius over the total temperature rise of 30 °C. The unapodised spectral resolution was 4 cm−1, limited by the aperture size. The best achieved signal-to-noise ratio was about 38 000:1 with a commercially available DLaTGS detector. Although the vibration sensitivity still requires improvement, the interferometer as a whole performed very well and could be further developed to meet all the requirements of a portable and stable spectrometer.
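A hedged sketch of the tilt-versus-modulation relation is given below. Under the common uniform-beam assumption, a mirror tilt α reduces the modulation depth of a circular beam of diameter D at wavenumber ν by the factor 2 J1(u)/u with u = 2πναD; the thesis' actual method additionally corrects for the non-uniform laser intensity profile, which this sketch does not attempt. The numbers only check that a 5 μrad tilt over a 25 mm beam gives roughly the quoted 0.7 percent loss at 3000 cm−1.

```python
# Uniform-beam estimate of the modulation loss caused by mirror tilt in a
# plane-mirror Michelson interferometer:  M(alpha) = 2*J1(u)/u,  u = 2*pi*nu*alpha*D,
# where nu is the wavenumber, alpha the tilt angle and D the beam diameter.
# The thesis further corrects for the non-uniform laser beam; this sketch does not.
import numpy as np
from scipy.special import j1
from scipy.optimize import brentq

def modulation(alpha, nu_cm=3000.0, D_cm=2.5):
    u = 2.0 * np.pi * nu_cm * alpha * D_cm
    return 2.0 * j1(u) / u if u > 0 else 1.0

def tilt_from_modulation(M, nu_cm=3000.0, D_cm=2.5):
    # numerically invert M(alpha) on the first, monotonic branch of 2*J1(u)/u
    return brentq(lambda a: modulation(a, nu_cm, D_cm) - M, 1e-9, 1e-3)

alpha = 5e-6   # 5 urad tilt over the 3 mm mirror travel (value quoted in the abstract)
M = modulation(alpha)
print(f"modulation at 3000 1/cm: {M:.4f}  (loss {100 * (1 - M):.2f} %)")
print(f"tilt recovered from that modulation: {tilt_from_modulation(M) * 1e6:.2f} urad")
```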
Abstract:
End-user development is a very common but often largely overlooked phenomenon in information systems research and practice. End-user development means that regular people, the end-users of software rather than professional developers, are doing software development. A large number of people are directly or indirectly affected by the results of these non-professional development activities. The number of users performing end-user development activities is difficult to ascertain precisely, but it is very large and still growing. Computer adoption is growing towards 100% and many new types of computational devices are continually being introduced. In addition, devices not previously programmable are becoming so. This means that, at this very moment, hundreds of millions of people are likely struggling with development problems. Furthermore, software itself is continually being adapted for more flexibility, enabling users to change the behaviour of their software themselves. New software and services are helping to transform users from consumers into producers, and much of this is now found online. The problem for the end-user developer is that little of this development is supported by anyone. Organisations often do not notice end-user development and consequently neither provide support for it nor are equipped to do so. Many end-user developers do not belong to any organisation at all. The end-user development process itself may also aggravate the problem. End-users are usually not strongly committed to the development process, which tends to be iterative and ad hoc, so seeking support becomes a distant third behind getting the job done and figuring out the development issues needed to get the job done. Sometimes the software itself may exacerbate the issue by simplifying the development process and de-emphasising the difficulty of the task being undertaken. Online support could be the lifeline the end-user developer needs: going online, one can find all the knowledge one could ever need. However, that alone does not help the end-user apply this information or knowledge in practice. A virtual community, through its ability to adopt the end-user's specific context, could surmount this final obstacle. This thesis explores the concept of end-user development and how it could be supported through online sources, in particular virtual communities, which, it is argued here, seem to fit the end-user developer's needs very well. The experiences of real end-user developers and prior literature were used in this process. The emphasis has been on those end-user developers, e.g. small business owners, who may have literally nowhere to turn to for support. Adopting the viewpoint of the end-user developer, the thesis examines how an end-user could use a virtual community effectively, improving the results of the support process, assuming the common situation where the demand for support outstrips the supply.
Abstract:
After the socialist revolution, Soviet Russia had to resolve, among many other questions, how to defend the achievements of socialism. At first the planned solution was a Red Guard based on voluntary service, but to secure a sufficient number of men, general conscription was adopted. Soon after the Russian Civil War, the direction of the art of war came to be shaped more by the views of the old army's experts than by the experiences of the heroes of the revolution, even though the doctrine Frunze wrote for the Red Army was based on class struggle and emphasised the operational mobility found effective in the civil war. The foundation of Soviet and Russian military art is the Western orientation begun by Peter I, complemented, however, by strong national characteristics. Aleksandr Suvorov can with good reason be regarded as the spiritual father of Russian military art; his teachings appear not only in quotations but also in principles and in military education. The Imperial General Staff Academy, founded after the Napoleonic Wars, established military-scientific research and teaching in Russia. The possibilities of military science were not fully exploited in nineteenth-century Russia; an attitude that belittled the growing effectiveness of weaponry led to a decline in the art of war and to catastrophe in the Russo-Japanese War. In analysing its lessons, Aleksandr Neznamov further developed the German concept of the operation and laid the foundation for the operational art developed in the Soviet Union in the 1920s. The aim of Soviet military art was to develop a tactical and operational answer to the superiority of the defence brought about by the growing effectiveness of weaponry. The solution drew on British experience and research, yet Soviet tactics and operational art were not eastern copies of British mechanised warfare or the German blitzkrieg but rested on independent solutions. The theory of deep battle and the deep operation was tested in exercises and developed until Stalin's purges of 1937. In the battles of the Second World War, after the catastrophe of the initial phase, the Red Army applied the doctrines of deep battle and the deep operation. The skill of commanders and troops was not sufficient to meet the demands of the theory, which is why the battle intended to be deep at times became merely dense. Based on the experience of the Great Patriotic War, Soviet military science developed the principles of combined-arms combat, which have remained unchanged to the present day. During the Cold War era the relative weight of nuclear and conventional weaponry in the picture of war and battle varied. The development of Western military art and weapons technology forced the Soviet Union in the 1980s to shift its military thinking from the offensive to the defensive. Since the dissolution of the Soviet Union, nuclear weapons have been the guarantor of Russia's military security. The conventional aerospace attack capability of the United States requires Russia to develop counter-systems. In building its conventional forces, Russia follows the development of Western military art closely but keeps to original solutions of its own, in whose development its strong military-scientific system and the method of dialectical materialism continue to play an essential role.
Abstract:
Developing bioimage informatics, from microscopy to software solutions, with the α2β1 integrin as an application example. When the human genome was sequenced in 2003, the main task of the life sciences became determining the functions of the different genes, and various bioimaging techniques became central research methods. Technological advances led in particular to an explosive growth in the popularity of fluorescence-based light microscopy techniques, but microscopy had to turn from a qualitative science into a quantitative one. This change gave rise to a new discipline, bioimage informatics, which has been said to have the potential to revolutionise the life sciences. This dissertation presents a broad, interdisciplinary body of work in the field of bioimage informatics. The first aim of the dissertation was to develop protocols for four-dimensional confocal microscopy of living cells, one of the fastest growing bioimaging methods at the time. The human collagen receptor α2β1 integrin, an important molecule in many physiological and pathological processes, served as the application example. The work produced clear visualisations of integrin movement, clustering and internalisation into the cell, but tools for quantitative analysis of the image information were lacking. The second aim of the dissertation therefore became the development of software suited to such analysis. Bioimage informatics was emerging at the same time, and what the new field needed most acutely were specialised software tools. The most important result of this dissertation work thus became BioImageXD, a novel open-source software package for the visualisation, processing and analysis of multidimensional bioimages. BioImageXD grew into one of the largest and most versatile packages in its field. It was published in the bioimage informatics special issue of Nature Methods and became well known and widely used. The third aim of the dissertation was to apply the developed methods to something more practical. Synthetic silica nanoparticles were produced, carrying antibodies that recognise the α2β1 integrin as "address labels". With BioImageXD it was shown that the nanoparticles have potential for targeted drug delivery applications. One fundamental aim of this dissertation work was to advance the new and little-known discipline of bioimage informatics, and this aim was achieved in particular through BioImageXD and its numerous published applications. The dissertation work has significant potential for the future, but bioimage informatics faces serious challenges: the field is too complex for the average biomedical researcher to master, and its most central element, open-source software development, is undervalued. Several improvements are needed in these respects,
Abstract:
Can crowdsourcing solutions serve many masters? Can they be beneficial both for laymen and native speakers of minority languages on the one hand and for serious linguistic research on the other? How did an infrastructure that was designed to support linguistics turn out to be a solution for raising awareness of native languages? Since 2012 the National Library of Finland has been developing the Digitisation Project for Kindred Languages, in which the key objective is to support a culture of openness and interaction in linguistic research, but also to promote crowdsourcing as a tool for the participation of the language community in research. In the course of the project, over 1,200 monographs and nearly 111,000 pages of newspapers in Finno-Ugric languages will be digitised and made available in the Fenno-Ugrica digital collection. This material was published in the Soviet Union in the 1920s and 1930s, and users have had only sporadic access to it. The publication of open-access and searchable materials from this period is a goldmine for researchers. Historians, social scientists and laymen with an interest in specific local publications can now find text materials pertinent to their studies. The linguistically oriented population can also find writings to delight them: (1) lexical items specific to a given publication, and (2) orthographically documented specifics of phonetics. In addition to the open-access collection, we developed an open-source OCR editor that enables the editing of machine-encoded text for the benefit of linguistic research. This tool was necessary since these rare and peripheral prints often include archaic characters, which are neglected by modern OCR software developers but belong to the historical context of the kindred languages and are thus an essential part of the linguistic heritage. When modelling the OCR editor, it was essential to consider both the needs of researchers and the capabilities of lay citizens, and to have them participate in the planning and execution of the project from the very beginning. By implementing the feedback iteratively from both groups, it was possible to turn the requested changes into tools for research that not only supported the work of linguists but also encouraged the citizen scientists to face the challenge and work with the crowdsourcing tools for the benefit of research. This presentation will not only deal with the technical aspects, developments and achievements of the infrastructure but will also highlight the way in which the user groups, researchers and lay citizens, were engaged in the process as an active and communicative group of users, and how their contributions were turned to mutual benefit.
Abstract:
Poster at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
Workshop at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
With the shift towards many-core computer architectures, dataflow programming has been proposed as one potential solution for producing software that scales to a varying number of processor cores. Programming for parallel architectures is considered difficult, as the current popular programming languages are inherently sequential and introducing parallelism is typically left to the programmer. Dataflow, however, is inherently parallel, describing an application as a directed graph where nodes represent calculations and edges represent data dependencies in the form of queues. These queues are the only allowed communication between the nodes, making the dependencies between the nodes explicit and thereby also the parallelism. Once a node has sufficient inputs available, it can, independently of any other node, perform calculations, consume inputs, and produce outputs. Dataflow models have existed for several decades and have become popular for describing signal processing applications, as the graph representation is a very natural one within this field; digital filters are typically described with boxes and arrows also in textbooks. Dataflow is also becoming more interesting in other domains, and in principle any application working on an information stream fits the dataflow paradigm. Such applications include, among others, network protocols, cryptography, and multimedia applications. As an example, the MPEG group standardized a dataflow language called RVC-CAL to be used within reconfigurable video coding. Describing a video coder as a dataflow network instead of with conventional programming languages makes the coder more readable, as it describes how the video data flows through the different coding tools. While dataflow provides an intuitive representation for many applications, it also introduces some new problems that need to be solved in order for dataflow to be more widely used. The explicit parallelism of a dataflow program is descriptive and enables an improved utilization of the available processing units; however, the independent nodes also imply that some kind of scheduling is required. The need for efficient scheduling becomes even more evident when the number of nodes is larger than the number of processing units and several nodes are running concurrently on one processor core. There exist several dataflow models of computation, with different trade-offs between expressiveness and analyzability. These vary from rather restricted but statically schedulable models, with minimal scheduling overhead, to dynamic models where each firing requires a firing rule to be evaluated. The model used in this work, namely RVC-CAL, is a very expressive language, and in the general case it requires dynamic scheduling; however, the strong encapsulation of dataflow nodes enables analysis, and the scheduling overhead can be reduced by using quasi-static, or piecewise static, scheduling techniques. The scheduling problem is concerned with finding the few scheduling decisions that must be made at run-time, while most decisions are pre-calculated. The result is then an, as small as possible, set of static schedules that are dynamically scheduled. To identify these dynamic decisions and to find the concrete schedules, this thesis shows how quasi-static scheduling can be represented as a model checking problem. This involves identifying the relevant information needed to generate a minimal but complete model to be used for model checking. The model must describe everything that may affect scheduling of the application while omitting everything else in order to avoid state space explosion. This kind of simplification is necessary to make the state space analysis feasible. For the model checker to find the actual schedules, a set of scheduling strategies is defined which is able to produce quasi-static schedulers for a wide range of applications. The results of this work show that actor composition with quasi-static scheduling can be used to transform dataflow programs to fit many different computer architectures with different types and numbers of cores. This, in turn, enables dataflow to provide a more platform-independent representation, as one application can be fitted to a specific processor architecture without changing the actual program representation. Instead, the program representation is optimized by the development tools, in the context of design space exploration, to fit the target platform. This work focuses on representing the dataflow scheduling problem as a model checking problem and is implemented as part of a compiler infrastructure. The thesis also presents experimental results as evidence of the usefulness of the approach.
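The dataflow model described above, independent nodes that fire once enough tokens are queued on their input edges, can be illustrated with a small sketch; the actors, firing rules, and naive dynamic scheduler below are invented for the example and are far simpler than RVC-CAL's actor model or the quasi-static schedules derived in the thesis.

```python
# Tiny dataflow sketch: actors communicate only through FIFO queues and fire
# when their firing rule is satisfied (enough input tokens available).
# The naive scheduler stands in for the far more sophisticated quasi-static
# scheduling discussed in the thesis.
from collections import deque

class Actor:
    def __init__(self, inputs, outputs):
        self.inputs, self.outputs = inputs, outputs
    def can_fire(self):            # default firing rule: one token on every input
        return all(q for q in self.inputs)
    def fire(self):
        raise NotImplementedError

class Source(Actor):
    def __init__(self, out, values):
        super().__init__([], [out]); self.values = deque(values)
    def can_fire(self):
        return bool(self.values)
    def fire(self):
        self.outputs[0].append(self.values.popleft())

class Add(Actor):                  # consumes one token from each input, produces their sum
    def fire(self):
        a, b = self.inputs[0].popleft(), self.inputs[1].popleft()
        self.outputs[0].append(a + b)

class Printer(Actor):
    def fire(self):
        print("token:", self.inputs[0].popleft())

# build the graph:  src1 --\
#                           Add --> Printer
#                   src2 --/
q1, q2, q3 = deque(), deque(), deque()
actors = [Source(q1, [1, 2, 3]), Source(q2, [10, 20, 30]),
          Add([q1, q2], [q3]), Printer([q3], [])]

# naive dynamic scheduler: keep firing any actor whose firing rule holds
fired = True
while fired:
    fired = False
    for a in actors:
        while a.can_fire():
            a.fire()
            fired = True
```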