986 results for Validated Interval Software


Relevance:

20.00%

Publisher:

Abstract:

Accurate and stable time series of geodetic parameters can be used to help in understanding the dynamic Earth and its response to global change. The Global Positioning System, GPS, has proven to be invaluable in modern geodynamic studies. In Fennoscandia the first GPS networks were set up in 1993. These networks form the basis of the national reference frames in the area, but they also provide long and important time series for crustal deformation studies. These time series can be used, for example, to better constrain the ice history of the last ice age and the Earth's structure, via existing glacial isostatic adjustment models. To improve the accuracy and stability of the GPS time series, the possible nuisance parameters and error sources need to be minimized. We have analysed GPS time series to study two phenomena. First, we study the refraction of the GPS signal in the neutral atmosphere, and, second, we study the surface loading of the crust by environmental factors, namely the non-tidal Baltic Sea, the atmospheric load and varying continental water reservoirs. We studied the atmospheric effects on the GPS time series by comparing the standard method to slant delays derived from a regional numerical weather model. We have presented a method for correcting the atmospheric delays at the observational level. The results show that both standard atmosphere modelling and the atmospheric delays derived from a numerical weather model by ray-tracing provide a stable solution. The advantage of the latter is that the number of unknowns used in the computation decreases and thus the computation may become faster and more robust. The computation can also be done with any processing software that allows the atmospheric correction to be turned off. The crustal deformation due to loading was computed by convolving Green's functions with surface load data, that is to say, global hydrology models, global numerical weather models and a local model for the Baltic Sea. The result was that the loading factors can be seen in the GPS coordinate time series. Reducing the computed deformation from the vertical time series of GPS coordinates reduces the scatter of the time series; however, the long-term trends are not influenced. We show that global hydrology models and the local sea surface can explain up to 30% of the GPS time series variation. On the other hand, atmospheric loading admittance in the GPS time series is low, and different hydrological surface load models could not be validated in the present study. In order to be used for GPS corrections in the future, both atmospheric loading and hydrological models need further analysis and improvements.
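
As a rough illustration of the loading computation, the vertical deformation at a GPS site can be written as a sum over the cells of a surface load grid of the cell mass weighted by a loading Green's function of the angular site-cell distance. The sketch below is a minimal point-load approximation with made-up grid values; the Green's function is a placeholder decay curve standing in for a proper elastic-loading (Farrell-type) Green's function.

```python
import numpy as np

def angular_distance(lat1, lon1, lat2, lon2):
    """Great-circle angular distance (radians) between the site and each load cell."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    cosd = (np.sin(lat1) * np.sin(lat2) +
            np.cos(lat1) * np.cos(lat2) * np.cos(lon2 - lon1))
    return np.arccos(np.clip(cosd, -1.0, 1.0))

def vertical_loading(site_lat, site_lon, cell_lat, cell_lon, load_kg_m2, cell_area_m2, greens):
    """Sum the Green's-function response over all load cells (point-load approximation)."""
    psi = angular_distance(site_lat, site_lon, cell_lat, cell_lon)
    return np.sum(greens(psi) * load_kg_m2 * cell_area_m2)   # vertical displacement (m)

# Placeholder Green's function: an illustrative decay with distance only,
# NOT a real elastic-loading Green's function table.
toy_greens = lambda psi: -1.0e-18 / np.maximum(psi, 1e-3)

# Toy load grid around the Baltic (values are made up).
cell_lat = np.array([59.0, 60.0, 61.0])
cell_lon = np.array([20.0, 21.0, 22.0])
load = np.array([100.0, 150.0, 80.0])    # kg/m^2 of water-equivalent load
area = np.full(3, 1.2e10)                # approximate cell areas, m^2

print(vertical_loading(60.2, 24.9, cell_lat, cell_lon, load, area, toy_greens))
```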

Relevance:

20.00%

Publisher:

Abstract:

Purpose: Mutations in IDH3B, an enzyme participating in the Krebs cycle, have recently been found to cause autosomal recessive retinitis pigmentosa (arRP). The MDH1 gene maps within the RP28 arRP linkage interval and encodes cytoplasmic malate dehydrogenase, an enzyme functionally related to IDH3B. As a proof of concept for candidate gene screening to be routinely performed by ultra high throughput sequencing (UHTs), we analyzed MDH1 in a patient from each of the two families described so far to show linkage between arRP and RP28. Methods: With genomic long-range PCR, we amplified all introns and exons of the MDH1 gene (23.4 kb). PCR products were then sequenced by short-read UHTs with no further processing. Computer-based mapping of the reads and mutation detection were performed by three independent software packages. Results: Despite the intrinsic complexity of human genome sequences, reads were easily mapped and analyzed, and all algorithms used provided the same results. The two patients were homozygous for all DNA variants identified in the region, which confirms previous linkage and homozygosity mapping results, but had different haplotypes, indicating genetic or allelic heterogeneity. None of the DNA changes detected could be associated with the disease. Conclusions: The MDH1 gene is not the cause of RP28-linked arRP. Our experimental strategy shows that long-range genomic PCR followed by UHTs provides an excellent system to perform a thorough screening of candidate genes for hereditary retinal degeneration.
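
The homozygosity and haplotype comparison reported in the Results can be illustrated with a small sketch over toy genotype calls; the positions and the VCF-style genotype encoding below are illustrative assumptions, not output of the study's three mapping packages.

```python
# Toy variant calls keyed by position: genotype strings as in a VCF (0/0 = hom ref, 1/1 = hom alt).
patient1 = {154_321_001: "1/1", 154_322_450: "0/0", 154_325_980: "1/1"}
patient2 = {154_321_001: "0/0", 154_322_450: "1/1", 154_325_980: "1/1"}

def all_homozygous(genotypes):
    """True if every called genotype is homozygous (reference or alternate)."""
    return all(gt in ("0/0", "1/1") for gt in genotypes.values())

print("patient 1 homozygous at all sites:", all_homozygous(patient1))
print("patient 2 homozygous at all sites:", all_homozygous(patient2))
print("identical haplotypes across the region:", patient1 == patient2)
```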

Relevance:

20.00%

Publisher:

Abstract:

The StreamIt programming model has been proposed to exploit parallelism in streaming applications on general purpose multi-core architectures. This model allows programmers to specify the structure of a program as a set of filters that act upon data, and a set of communication channels between them. The StreamIt graphs describe task, data and pipeline parallelism which can be exploited on modern Graphics Processing Units (GPUs), as they support abundant parallelism in hardware. In this paper, we describe the challenges in mapping StreamIt to GPUs and propose an efficient technique to software pipeline the execution of stream programs on GPUs. We formulate this problem - both scheduling and assignment of filters to processors - as an efficient Integer Linear Program (ILP), which is then solved using ILP solvers. We also describe a novel buffer layout technique for GPUs which facilitates exploiting the high memory bandwidth available in GPUs. The proposed scheduling utilizes both the scalar units in the GPU, to exploit data parallelism, and the multiprocessors, to exploit task and pipeline parallelism. Further, it takes into consideration the synchronization and bandwidth limitations of GPUs, and yields speedups between 1.87X and 36.83X over a single-threaded CPU.
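
To give a flavour of the ILP, the sketch below formulates just the assignment side with PuLP: binary variables place each filter on exactly one processor and the objective minimizes the maximum per-processor load. The filter costs and processor count are assumptions for illustration; the paper's actual ILP also captures scheduling, synchronization and bandwidth constraints.

```python
import pulp

costs = {"source": 2, "fir": 8, "decimate": 3, "sink": 1}   # assumed per-iteration work
procs = range(4)                                            # assumed number of processors

prob = pulp.LpProblem("filter_assignment", pulp.LpMinimize)
x = pulp.LpVariable.dicts("x", (costs, procs), cat="Binary")  # x[f][p] = 1 if filter f runs on p
makespan = pulp.LpVariable("makespan", lowBound=0)

prob += makespan                                              # minimize the maximum load
for f in costs:                                               # each filter on exactly one processor
    prob += pulp.lpSum(x[f][p] for p in procs) == 1
for p in procs:                                               # each processor's load bounded by makespan
    prob += pulp.lpSum(costs[f] * x[f][p] for f in costs) <= makespan

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for f in costs:
    print(f, "->", next(p for p in procs if x[f][p].value() == 1))
```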

Relevance:

20.00%

Publisher:

Abstract:

Automatic identification of software faults has enormous practical significance. This requires characterizing program execution behavior and the use of appropriate data mining techniques on the chosen representation. In this paper, we use the sequence of system calls to characterize program execution. The data mining tasks addressed are learning to map system call streams to fault labels and automatic identification of fault causes. Spectrum kernels and SVM are used for the former, while latent semantic analysis is used for the latter. The techniques are demonstrated for the intrusion dataset containing system call traces. The results show that kernel techniques are as accurate as the best available results but are faster by orders of magnitude. We also show that latent semantic indexing is capable of revealing fault-specific features.
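
Because an SVM with a k-spectrum kernel is equivalent to a linear SVM over k-mer counts of the system-call alphabet, the mapping from call streams to fault labels can be sketched with standard tools as below; the traces and labels are toy placeholders, not the intrusion dataset used in the paper.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Toy system-call traces and fault labels (placeholders for a real trace corpus).
traces = [
    "open read read write close",
    "open mmap read write write close",
    "open execve dup2 close",
    "socket connect execve close",
]
labels = ["normal", "normal", "fault", "fault"]

# k-mer (k=3) counts over the call alphabet; a linear SVM on these counts is
# equivalent to an SVM with the 3-spectrum kernel.
model = make_pipeline(CountVectorizer(analyzer="word", ngram_range=(3, 3)), LinearSVC())
model.fit(traces, labels)

print(model.predict(["open read read write close", "socket connect execve close"]))
```

The fault-cause side (latent semantic analysis) would correspond to applying a truncated SVD to the same k-mer count matrix, which is omitted here.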

Relevance:

20.00%

Publisher:

Abstract:

We all have fresh in our memory what happened to the IT sector only a few years ago when the IT bubble burst. The upswing of productivity in this sector slowed down, investors lost large investments, many found themselves looking for a new job, and countless dreams fell apart. Product developers in the IT sector have experienced a large number of organizational restructurings since the IT boom, including rapid growth, downsizing processes, and structural reforms. Organizational restructurings seem to be a complex and continuous phenomenon people in this sector have to deal with. How do software product developers retrospectively construct their work in relation to organizational restructurings? How do organizational restructurings bring about specific social processes in product development? This working paper focuses on these questions. The overall aim is to develop an understanding of how software product developers construct their work during organizational restructurings. The theoretical frame of reference is based on a social constructionist approach and discourse analysis. This approach offers more or less radical and critical alternatives to mainstream organizational theory. Writings from this perspective attempt to investigate and understand the sociocultural processes by which various realities are created. Therefore these studies aim at showing how people participate in constituting the social world (Gergen & Thatchenkery, 1996); knowledge of the world is seen to be constructed between people in daily interaction, in which language plays a central role. This means that interaction, especially the ways of talking and writing about product development during organizational restructurings, becomes the target of concern. This study consists of 25 in-depth interviews following a pilot study based on 57 semi-structured interviews. In this working paper I analyze 9 of the in-depth interviews. The interviews were conducted in eight IT firms. The analysis explores how discourses are constructed and function, as well as the consequences that follow from different discourses. The analysis shows that even though the product developers have experienced many organizational restructurings, some of which have been far-reaching, their accounts build strongly on a stability discourse. According to this discourse, product development is, perhaps surprisingly, not influenced to a great extent by organizational restructurings. This does not mean that product development is static. According to the social constructionist approach, product development is constantly being reproduced and maintained in ongoing processes. In other words, stable effects are also ongoing achievements, and these are of particular interest in this study. The product developers maintain rather than change the product development through ongoing processes of construction, even when they experience continuous extensive organizational restructurings. The discourse of stability exists alongside other discourses, some of which contradict each other. Together they direct product development and generate meanings. The product developers consequently take an active role in the construction of their work during organizational restructurings. When doing this they also negotiate credible positions for themselves.

Relevance:

20.00%

Publisher:

Abstract:

Many Finnish IT companies have gone through numerous organizational changes over the past decades. This book draws attention to how stability may be central to software product development experts and IT workers more generally, who continuously have to cope with such change in their workplaces. It does so by analyzing and theorizing change and stability as intertwined and co-existent, thus throwing light on how it is possible that, for example, even if ‘the walls fall down the blokes just code’ and maintain a sense of stability in their daily work. Rather than reproducing the picture of software product development as exciting cutting edge activities and organizational change as dramatic episodes, the study takes the reader beyond the myths surrounding these phenomena to the mundane practices, routines and organizings in product development during organizational change. An analysis of these ordinary practices offers insights into how software product development experts actively engage in constructing stability during organizational change through a variety of practices, including solidarity, homosociality, close relations to products, instrumental or functional views on products, preoccupations with certain tasks and humble obedience. Consequently, the study shows that it may be more appropriate to talk about varieties of stability, characterized by a multitude of practices of stabilizing rather than states of stagnation. Looking at different practices of stability in depth shows the creation of software as an arena for micro-politics, power relations and increasing pressures for order and formalization. The thesis gives particular attention to power relations and processes of positioning following organizational change: how social actors come to understand themselves in the context of ongoing organizational change, how they comply with and/or contest dominant meanings, how they identify and dis-identify with formalization, and how power relations often are reproduced despite dis-identification. Related to processes of positioning, the reader is also given a glimpse into what being at work in a male-dominated and relatively homogeneous work environment looks like. It shows how the strong presence of men or “blokes” of a particular age and education seems to become invisible in workplace talk that appears ‘non-conscious’ of gender.

Relevance:

20.00%

Publisher:

Abstract:

In cancer diagnostics and treatment, nanoparticles can act as carriers for drugs, diagnostic agents, or nucleic acid sequences. Targeting molecules can be attached to the carrier for passive or active targeting of the particles, or a radiolabel can be attached for imaging or radiotherapy. Carriers can improve the physicochemical properties and bioavailability of a drug, reduce systemic side effects, prolong the drug's half-life and thereby lengthen the dosing interval, and improve the drug's access to the target tissue. In this way the efficacy of chemo- and radiotherapy and the probability of successful treatment can be improved. The literature review examines the role of nanocarriers in cancer treatment. Despite decades of research, only two (Europe) or three (United States) nanoparticle formulations have been approved for the market in cancer treatment. The main problems are insufficient accumulation in the target tissue, immunogenicity, and the lability of the nanoparticles. The experimental part investigates, in vitro and in mice in vivo, the two-step targeting of 99mTc-labelled, PEG-coated biotin liposomes to human ovarian adenocarcinoma cells. Targeting is carried out with a biotinylated cetuximab (Erbitux®) antibody, which binds to the EGF receptors overexpressed by the cells. Two-step targeting is compared with direct and/or passive targeting. The development of more efficient imaging methods has accelerated research on targeted nanoparticles. Using radionuclide imaging, the distribution of the radiolabel in the body can be followed and phenomena occurring at the cellular level can be imaged. The literature review also covers SPECT and PET imaging in cancer treatment and their use in drug development for imaging nanoparticles. These imaging methods stand out from other methods in terms of high resolution, sensitivity, and ease of use. In the experimental part, the distribution of the 99mTc-labelled liposomes in mice was studied with a SPECT-CT device. The activity in the tumour, spleen, and liver was quantified with the InVivoScope software and a gamma counter, and the results were compared with each other. In the in vitro experiment, two-step targeting achieved 2.7- to 3.5-fold (depending on the cell line) accumulation in the cells compared with control liposomes. However, direct targeting performed better than two-step targeting in vitro. In the in vivo experiments, the liposomes distributed to the tumour more efficiently when dosed i.p. than i.v. Two-step targeting achieved 1.24-fold distribution to the tumour (% ID/g of tissue) compared with passively targeted liposomes. The %ID/organ was 5.9% for the targeted liposomes and 5.4% for the passively targeted ones, so the actual difference was small. The InVivoScope and gamma counter results did not correlate with each other. Further studies and optimization of the method are required for targeting liposomes to tumours.
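
For reference, the biodistribution quantities quoted above (%ID/organ and %ID/g of tissue) follow from the standard definitions sketched below; the numerical inputs are placeholders, not the study's measurements.

```python
def percent_id_per_organ(organ_counts, injected_counts):
    """Percent of the injected dose found in the whole organ."""
    return 100.0 * organ_counts / injected_counts

def percent_id_per_gram(organ_counts, injected_counts, organ_mass_g):
    """Percent of the injected dose per gram of tissue."""
    return percent_id_per_organ(organ_counts, injected_counts) / organ_mass_g

print(percent_id_per_organ(5.9e5, 1.0e7))        # e.g. 5.9 %ID in the whole tumour
print(percent_id_per_gram(5.9e5, 1.0e7, 0.25))   # %ID/g for a 0.25 g tumour (placeholder mass)
```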

Relevance:

20.00%

Publisher:

Abstract:

Let G(V, E) be a simple, undirected graph where V is the set of vertices and E is the set of edges. A b-dimensional cube is a Cartesian product l(1) x l(2) x ... x l(b), where each l(i) is a closed interval of unit length on the real line. The cubicity of G, denoted by cub(G), is the minimum positive integer b such that the vertices of G can be mapped to axis-parallel b-dimensional cubes in such a way that two vertices are adjacent in G if and only if their assigned cubes intersect. An interval graph is a graph that can be represented as the intersection of intervals on the real line, i.e. the vertices of an interval graph can be mapped to intervals on the real line such that two vertices are adjacent if and only if their corresponding intervals overlap. Suppose S(m) denotes a star graph on m+1 nodes. We define the claw number psi(G) of the graph to be the largest positive integer m such that S(m) is an induced subgraph of G. It can be easily shown that the cubicity of any graph is at least ⌈log2 psi(G)⌉. In this article, we show that for an interval graph G, ⌈log2 psi(G)⌉ <= cub(G) <= ⌈log2 psi(G)⌉ + 2. It is not clear whether the upper bound of ⌈log2 psi(G)⌉ + 2 is tight: till now we are unable to find any interval graph with cub(G) > ⌈log2 psi(G)⌉. We also show that for an interval graph G, cub(G) <= ⌈log2 alpha⌉, where alpha is the independence number of G. Therefore, in the special case of psi(G) = alpha, cub(G) is exactly ⌈log2 alpha⌉. The concept of cubicity can be generalized by considering boxes instead of cubes. A b-dimensional box is a Cartesian product l(1) x l(2) x ... x l(b), where each l(i) is a closed interval on the real line. The boxicity of a graph, denoted box(G), is the minimum k such that G is the intersection graph of k-dimensional boxes. It is clear that box(G) <= cub(G). From the above result, it follows that for any graph G, cub(G) <= box(G) ⌈log2 alpha⌉. (C) 2010 Wiley Periodicals, Inc. J Graph Theory 65: 323-333, 2010
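
The lower bound ⌈log2 psi(G)⌉ can be checked directly from an interval representation: the neighbourhood of any vertex in an interval graph again induces an interval graph, so a maximum independent set in each neighbourhood (and hence the claw number) can be found greedily by right endpoints. The sketch below uses a toy interval representation chosen purely for illustration.

```python
import math

def claw_number(intervals):
    """intervals: dict vertex -> (left, right). Returns psi(G) of the interval graph."""
    def overlaps(a, b):
        return intervals[a][0] <= intervals[b][1] and intervals[b][0] <= intervals[a][1]

    best = 0
    for v in intervals:
        nbrs = [u for u in intervals if u != v and overlaps(u, v)]
        nbrs.sort(key=lambda u: intervals[u][1])      # greedy maximum independent set by right endpoint
        chosen, last_right = 0, -math.inf
        for u in nbrs:
            if intervals[u][0] > last_right:
                chosen, last_right = chosen + 1, intervals[u][1]
        best = max(best, chosen)
    return best

# Toy interval representation: vertex 1 spans four pairwise-disjoint intervals, i.e. an S(4).
G = {1: (0, 10), 2: (1, 2), 3: (3, 4), 4: (5, 6), 5: (7, 8)}
psi = claw_number(G)
print("psi(G) =", psi, "=> cub(G) >=", math.ceil(math.log2(psi)))
```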

Relevance:

20.00%

Publisher:

Abstract:

In voiced speech analysis epochal information is useful in accurate estimation of pitch periods and the frequency response of the vocal tract system. Ideally, linear prediction (LP) residual should give impulses at epochs. However, there are often ambiguities in the direct use of LP residual since samples of either polarity occur around epochs. Further, since the digital inverse filter does not compensate the phase response of the vocal tract system exactly, there is an uncertainty in the estimated epoch position. In this paper we present an interpretation of LP residual by considering the effect of the following factors: 1) the shape of glottal pulses, 2) inaccurate estimation of formants and bandwidths, 3) phase angles of formants at the instants of excitation, and 4) zeros in the vocal tract system. A method for the unambiguous identification of epochs from LP residual is then presented. The accuracy of the method is tested by comparing the results with the epochs obtained from the estimated glottal pulse shapes for several vowel segments. The method is used to identify the closed glottis interval for the estimation of the true frequency response of the vocal tract system.
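
A minimal sketch of the inverse-filtering step is given below: LP coefficients are estimated by the autocorrelation method (Levinson-Durbin) and the residual is obtained by applying A(z) to the signal, so that large residual samples mark candidate epochs. The synthetic impulse-train-excited signal and the all-pole filter used here are assumptions for illustration, not data or models from the paper.

```python
import numpy as np
from scipy.signal import lfilter

def lpc(x, order):
    """Autocorrelation-method LPC via Levinson-Durbin; returns A(z) = [1, a1, ..., ap]."""
    r = np.correlate(x, x, mode="full")[len(x) - 1:len(x) + order]
    a = np.zeros(order + 1)
    a[0], err = 1.0, r[0]
    for i in range(1, order + 1):
        k = -(r[i] + np.dot(a[1:i], r[i - 1:0:-1])) / err
        a[1:i + 1] += k * a[i - 1::-1][:i]    # Levinson recursion update; a[i] becomes k
        err *= 1.0 - k * k
    return a

# Synthetic "voiced" segment: an impulse train (epochs every 80 samples, i.e. 100 Hz
# pitch at 8 kHz) exciting a fixed, stable all-pole filter.
pitch_period, order = 80, 10
excitation = np.zeros(4 * pitch_period)
excitation[::pitch_period] = 1.0
speech = lfilter([1.0], [1.0, -1.2, 0.8], excitation)

a = lpc(speech, order)
residual = np.convolve(speech, a)[:len(speech)]   # inverse filtering with A(z)
print("candidate epochs:", np.sort(np.argsort(np.abs(residual))[-4:]))
```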

Relevance:

20.00%

Publisher:

Abstract:

Software development has been carried out in a somewhat systematic way for half a century. Despite this, serious failures in software development projects still occur. The pertinent mission of software project management is to continuously achieve more and more successful projects. The application of agile software methods and, more recently, the integration of Lean practices contribute to this trend of continuous improvement in the software industry. One area warranting proper empirical evidence is the operational efficiency of projects. In the field of software development, Kanban as a process management method has gained momentum recently, mostly due to its linkages to Lean thinking. However, only a few empirical studies investigate the impacts of Kanban on projects in that particular area. The aim of this doctoral thesis is to improve the understanding of how Kanban impacts software projects. The research is carried out in the area of Lean thinking, which contains a variety of concepts including Kanban. This article-based thesis conducts a set of case studies expanded with a quasi-controlled experiment as the research strategy. The data-gathering techniques of interviews, questionnaires, and different types of observations are used to study the case projects, and thereby to understand the impacts of Kanban on software development projects. The research papers of the thesis are refereed international journal and conference publications. The results highlight new findings regarding the application of Kanban in the software context. The key findings of the thesis suggest that Kanban is applicable to software development. Despite its several benefits reported in this thesis, the empirical evidence implies that Kanban is not all-encompassing but requires additional practices to keep development projects performing appropriately. Implications for research are given as well. In addition to these findings, the thesis contributes to the area of plan-driven software development by suggesting implications both for research and for practitioners. In conclusion, Kanban can benefit software development projects, but additional practices would increase its potential for the projects.

Relevance:

20.00%

Publisher:

Abstract:

Biomedical engineering solutions like surgical simulators need High Performance Computing (HPC) to achieve real-time performance. Graphics Processing Units (GPUs) offer HPC capabilities at low cost and low power consumption. In this work, it is demonstrated that a liver discretized by about 2500 finite element nodes can be graphically simulated in real time by making use of a GPU. The present work takes into consideration the time needed for the data transfer from CPU to GPU and back from GPU to CPU. Although the behaviour of the liver is very complicated, the present computer simulation assumes linear elastostatics. The commercial software ANSYS is used to obtain the global stiffness matrix of the liver. Results show that GPUs are useful for the real-time graphical simulation of the liver, which in turn is needed in simulators that are used for training surgeons in laparoscopic surgery. Although the computer simulation should also involve rendering, neither rendering nor the time needed for rendering and displaying the liver on a screen is considered in the present work. The present work is a demonstration of a concept; the concept has not been fully implemented and validated. Future work is to develop software that can accomplish real-time and very realistic graphical simulation of the liver, with the rendered image of the liver on the screen changing in real time according to the position of the surgical tool tip, approximated as the mouse cursor in 3D.
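
The core computation discussed above, solving the linear elastostatic system K u = f on the GPU including the CPU-GPU transfers, can be sketched with CuPy as below. The stiffness matrix here is a random symmetric positive definite stand-in of the same size as a ~2500-node model, not a matrix exported from ANSYS, and the snippet assumes CuPy with a CUDA device is available.

```python
import time
import numpy as np
import cupy as cp

n = 2500 * 3                                                # ~2500 nodes x 3 DOF (assumed size)
K_cpu = np.random.rand(n, n).astype(np.float32)
K_cpu = K_cpu @ K_cpu.T + n * np.eye(n, dtype=np.float32)   # SPD stand-in for the stiffness matrix
f_cpu = np.random.rand(n).astype(np.float32)                # nodal load vector (e.g. tool contact)

t0 = time.perf_counter()
K_gpu, f_gpu = cp.asarray(K_cpu), cp.asarray(f_cpu)         # host -> device transfer
u_gpu = cp.linalg.solve(K_gpu, f_gpu)                       # displacement solve on the GPU
u_cpu = cp.asnumpy(u_gpu)                                   # device -> host transfer
cp.cuda.Stream.null.synchronize()
print(f"transfer + solve + transfer: {time.perf_counter() - t0:.3f} s")
```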

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we propose a first approximation of the thickness of Quaternary sediments and of the late Quaternary-early Tertiary topography for part of the lower reaches of the Narmada valley in a systematic way, using a shallow seismic method that records both horizontal and vertical components of the microtremor (ambient noise) caused by natural processes. The measurements of microtremors were carried out at 31 sites spaced at a grid interval of 5 km, using a Lennartz seismometer (5 s period) and a City shark-II data acquisition system. The recorded signals were analysed for the horizontal-to-vertical (H/V) spectral ratio using the GEOPSY software. For the present study, we concentrate on the frequency range between 0.2 Hz and 10 Hz. The thickness of unconsolidated sediments at the various sites is calculated based on the non-linear regression equations proposed by Ibs-von Seht and Wohlenberg (1999) and Parolai et al. (2002). The estimated thickness is used to plot a digital elevation model and cross profiles, which are correlated with the geomorphology and geology of the study area. (C) 2011 Elsevier Ltd. All rights reserved.
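
The thickness estimation step reduces to evaluating a power-law regression h = a * f0^b at each site's fundamental H/V resonance frequency. The sketch below uses the coefficient values commonly quoted for Ibs-von Seht and Wohlenberg (1999) and Parolai et al. (2002); they should be verified against the original papers before use.

```python
def thickness_ibs_von_seht(f0_hz):
    """Sediment thickness (m) from the fundamental H/V resonance frequency (Hz)."""
    return 96.0 * f0_hz ** -1.388      # coefficients as commonly quoted for the 1999 regression

def thickness_parolai(f0_hz):
    return 108.0 * f0_hz ** -1.551     # coefficients as commonly quoted for the 2002 regression

for f0 in (0.5, 1.0, 2.0):             # example resonance frequencies within the 0.2-10 Hz band
    print(f0, round(thickness_ibs_von_seht(f0), 1), round(thickness_parolai(f0), 1))
```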