989 results for Software Acquisition
Abstract:
Available industrial energy meters offer high accuracy and reliability, but are typically expensive and low-bandwidth, making them poorly suited to multi-sensor data acquisition schemes and power quality analysis. An alternative measurement system is proposed in this paper that is highly modular, extensible and compact. To minimise cost, the device makes use of planar coreless PCB transformers to provide galvanic isolation for both power and data. Samples from multiple acquisition devices may be concentrated by a central processor before integration with existing host control systems. This paper focusses on the practical design and implementation of planar coreless PCB transformers to facilitate the module's isolated power, clock and data signal transfer. Calculations necessary to design coreless PCB transformers, and circuits designed for the transformer's practical application in the measurement module are presented. The designed transformer and each application circuit have been experimentally verified, with test data and conclusions made applicable to coreless PCB transformers in general.
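The abstract refers to the calculations needed to design coreless PCB transformers without reproducing them. As a hedged illustration only, a first-pass estimate of a planar spiral winding's self-inductance can be sketched with the modified Wheeler expression for square spirals (Mohan et al.); the geometry values below are assumptions for the example, not figures from the paper:

```python
from math import pi

MU0 = 4 * pi * 1e-7  # vacuum permeability (H/m)

def spiral_inductance(n_turns, d_out, d_in, k1=2.34, k2=2.75):
    """Modified Wheeler estimate of the self-inductance (H) of a square
    planar spiral. d_out and d_in are outer/inner diameters in metres;
    k1, k2 are the published layout coefficients for a square spiral."""
    d_avg = (d_out + d_in) / 2
    rho = (d_out - d_in) / (d_out + d_in)  # fill ratio of the winding
    return k1 * MU0 * n_turns**2 * d_avg / (1 + k2 * rho)

# Illustrative winding: 20 turns, 30 mm outer, 10 mm inner diameter
L_primary = spiral_inductance(20, 30e-3, 10e-3)
```

A real design would pair this with a mutual-inductance estimate for the secondary spiral on the opposite PCB layer; the formula above only bounds the single-winding term.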
Abstract:
This paper describes a software architecture for real-world robotic applications. We discuss issues of software reliability, testing and realistic off-line simulation that allows the majority of the automation system to be tested off-line in the laboratory before deployment in the field. A recent project, the automation of a very large mining machine, is used to illustrate the discussion.
Abstract:
The effect of 18 months of training on the ovarian hormone concentrations and bone mineral density (BMD) accrual was assessed longitudinally in 14 adolescent rowers and 10 matched controls, aged 14–15 years. Ovarian hormone levels were assessed by urinary estrone glucuronide (E1G) and pregnanediol glucuronide (PdG) excretion rates, classifying the menstrual cycles as ovulatory or anovulatory. Total body (TB), total proximal femur (PF), femoral neck (FN) and lumbar spine (LS) (L2–4) bone mass were measured at baseline and 18 months using dual-energy X-ray densitometry. Results were expressed as bone mineral content (BMC), BMD and bone mineral apparent density (BMAD). Five rowers had anovulatory menstrual cycles compared with zero prevalence for the control subjects. Baseline TB BMD was significantly higher in the ovulatory rowers, with PF BMD, FN BMD and LS BMD similar for all groups. At completion, the LS bone accrual of the ovulatory rowers was significantly greater (BMC 8.1%, BMD 6.2%, BMAD 6.2%) than that of the anovulatory rowers (BMC 1.1%, BMD 3.9%, BMAD 1.6%) and ovulatory controls (BMC 0.5%, BMD 1.1%, BMAD 1.1%). No difference in TB, PF or FN bone accrual was observed among groups. This study demonstrated an osteogenic response to mechanical loading, with the rowers accruing greater bone mass than the controls at the lumbar spine. However, the exercise-induced osteogenic benefits were less when rowing training was associated with low estrogen and progesterone metabolite excretion.
Abstract:
Urinary tract infections (UTIs) are among the most common infectious diseases of humans, with Escherichia coli responsible for >80% of all cases. One extreme of UTI is asymptomatic bacteriuria (ABU), which occurs as an asymptomatic carrier state that resembles commensalism. To understand the evolution and molecular mechanisms that underpin ABU, the genome of the ABU E. coli strain VR50 was sequenced. Analysis of the complete genome indicated that it most resembles E. coli K-12, with the addition of a 94-kb genomic island (GI-VR50-pheV), eight prophages, and multiple plasmids. GI-VR50-pheV has a mosaic structure and contains genes encoding a number of UTI-associated virulence factors, namely, Afa (afimbrial adhesin), two autotransporter proteins (Ag43 and Sat), and aerobactin. We demonstrated that the presence of this island in VR50 confers its ability to colonize the murine bladder, as a VR50 mutant with GI-VR50-pheV deleted was attenuated in a mouse model of UTI in vivo. We established that Afa is the island-encoded factor responsible for this phenotype using two independent deletion (Afa operon and AfaE adhesin) mutants. E. coli VR50afa and VR50afaE displayed significantly decreased ability to adhere to human bladder epithelial cells. In the mouse model of UTI, VR50afa and VR50afaE displayed reduced bladder colonization compared to wild-type VR50, similar to the colonization level of the GI-VR50-pheV mutant. Our study suggests that E. coli VR50 is a commensal-like strain that has acquired fitness factors that facilitate colonization of the human bladder.
Abstract:
In 2005, Ginger Myles and Hongxia Jin proposed a software watermarking scheme based on converting jump instructions or unconditional branch statements (UBSs) into calls to a fingerprint branch function (FBF) that computes the correct target address of the UBS as a function of the generated fingerprint and integrity check. If the program is tampered with, the fingerprint and integrity checks change and the target address will not be computed correctly. In this paper, we present an attack that breaks the scheme by tracking stack pointer modifications, and we provide implementation details. The key element of the attack is to remove the fingerprint and integrity check generating code from the program after disassociating the target address from the fingerprint and integrity value. Using debugging tools, which give the attacker extensive control for tracking stack pointer operations, we perform both subtractive and watermark replacement attacks. The major steps in the attack are automated, resulting in a fast and low-cost attack.
Abstract:
Outlines some of the potential risks or actual harms that result from large-scale land leases or acquisitions and the relevant human rights and environmental law principles.
Abstract:
In two fMRI experiments, participants named pictures with superimposed distractors that were high or low in frequency or varied in terms of age of acquisition. Pictures superimposed with low-frequency words were named more slowly than those superimposed with high-frequency words, and late-acquired words interfered with picture naming to a greater extent than early-acquired words. The distractor frequency effect (Experiment 1) was associated with increased activity in left premotor and posterior superior temporal cortices, consistent with the operation of an articulatory response buffer and verbal self-monitoring system. Conversely, the distractor age-of-acquisition effect (Experiment 2) was associated with increased activity in the left middle and posterior middle temporal cortex, consistent with the operation of lexical level processes such as lemma and phonological word form retrieval. The spatially dissociated patterns of activity across the two experiments indicate that distractor effects in picture-word interference may occur at lexical or postlexical levels of processing in speech production.
Abstract:
Metabolic imaging using positron emission tomography (PET) has found increasing clinical use for the management of infiltrating tumours such as glioma. However, the heterogeneous biological nature of tumours and intrinsic treatment resistance in some regions means that knowledge of multiple biological factors is needed for effective treatment planning. For example, the use of 18F-FDOPA to identify infiltrative tumour and 18F-FMISO for localizing hypoxic regions. Performing multiple PET acquisitions is impractical in many clinical settings, but previous studies suggest multiplexed PET imaging could be viable. The fidelity of the two signals is affected by the injection interval, scan timing and injected dose. The contribution of this work is to propose a framework for explicitly trading off signal fidelity against logistical constraints when designing the imaging protocol. The particular case of estimating 18F-FMISO from a single frame prior to injection of 18F-FDOPA is considered. Theoretical experiments using simulations for typical biological scenarios in humans demonstrate that results comparable to a pair of single-tracer acquisitions can be obtained provided protocol timings are carefully selected. These results were validated using a pre-clinical data set that was synthetically multiplexed. The results indicate that the dual acquisition of 18F-FMISO and 18F-FDOPA could be feasible in the clinical setting. The proposed framework could also be used to design protocols for other tracers.
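The paper's full fidelity framework is not given in the abstract. One piece of the timing trade-off it describes can nonetheless be sketched: both tracers are fluorine-18 labelled, so the physical decay of the first injection at any candidate scan time follows the standard 18F half-life. The doses and interval below are illustrative assumptions, and physical decay is of course only one component alongside tracer kinetics:

```python
from math import exp, log

F18_HALF_LIFE_MIN = 109.8  # physical half-life of fluorine-18 (minutes)

def residual_fraction(t_min, half_life=F18_HALF_LIFE_MIN):
    """Fraction of an injected activity remaining after t_min minutes."""
    return exp(-log(2) * t_min / half_life)

def crosstalk_ratio(dose1_mbq, dose2_mbq, interval_min):
    """Activity of tracer 1 still present when tracer 2 is injected,
    expressed as a fraction of the tracer-2 dose (physical decay only)."""
    return dose1_mbq * residual_fraction(interval_min) / dose2_mbq

# e.g. equal 200 MBq doses injected 2 hours apart
ratio = crosstalk_ratio(200, 200, 120)
```

Lengthening the interval or reducing the first dose lowers this ratio, which is the logistical lever the abstract's protocol-design trade-off operates on.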
Abstract:
Species identification based on short sequences of DNA markers, that is, DNA barcoding, has emerged as an integral part of modern taxonomy. However, software for the analysis of large and multilocus barcoding data sets is scarce. The Basic Local Alignment Search Tool (BLAST) is currently the fastest tool capable of handling large databases (e.g. >5000 sequences), but its accuracy is a concern and it has been criticized for its local optimization. However, currently available, more accurate software requires sequence alignment or complex calculations, which are time-consuming when dealing with large data sets during data preprocessing or during the search stage. Therefore, it is imperative to develop a practical program for both accurate and scalable species identification for DNA barcoding. In this context, we present VIP Barcoding: user-friendly software with a graphical user interface for rapid DNA barcoding. It adopts a hybrid, two-stage algorithm. First, an alignment-free composition vector (CV) method is utilized to reduce the search space by screening a reference database. The alignment-based K2P distance nearest-neighbour method is then employed to analyse the smaller data set generated in the first stage. In comparison with other software, we demonstrate that VIP Barcoding has (i) higher accuracy than Blastn and several alignment-free methods and (ii) higher scalability than alignment-based distance methods and character-based methods. These results suggest that this platform is able to deal with both large-scale and multilocus barcoding data with accuracy and can contribute to DNA barcoding for modern taxonomy. VIP Barcoding is free and available at http://msl.sls.cuhk.edu.hk/vipbarcoding/.
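The second stage above relies on the Kimura two-parameter (K2P) distance, a standard substitution-model distance for aligned DNA. VIP Barcoding's own implementation is not shown in the abstract; the following is a minimal sketch of the textbook formula, with toy sequences as the example:

```python
from math import log, sqrt

PURINES = {"A", "G"}
PYRIMIDINES = {"C", "T"}

def k2p_distance(seq1, seq2):
    """Kimura two-parameter distance between two aligned DNA sequences.
    Sites with ambiguous bases or gaps are ignored."""
    pairs = [(a, b) for a, b in zip(seq1, seq2)
             if a in "ACGT" and b in "ACGT"]
    n = len(pairs)
    # transitions: purine<->purine or pyrimidine<->pyrimidine mismatches
    transitions = sum(1 for a, b in pairs if a != b and
                      ({a, b} <= PURINES or {a, b} <= PYRIMIDINES))
    transversions = sum(1 for a, b in pairs if a != b) - transitions
    p, q = transitions / n, transversions / n
    return -0.5 * log((1 - 2 * p - q) * sqrt(1 - 2 * q))

d = k2p_distance("ACGTACGT", "ACGTACGT")  # identical sequences -> 0.0
```

A nearest-neighbour identifier in the style of stage two would then assign a query to the reference sequence minimizing this distance over the candidates that survived the CV screen.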
Abstract:
This research explored how small and medium enterprises can achieve success with software as a service (SaaS) applications from the cloud. Based upon an empirical investigation of six growth-oriented and early-adopting small and medium enterprises, this study proposes a SaaS success model for small and medium enterprises with two approaches: one for basic and one for advanced benefits. The basic model explains the effective use of SaaS for achieving informational and transactional benefits. The advanced model explains the enhanced use of software as a service for achieving strategic and transformational benefits. Both models explicate the information systems capabilities and organizational complementarities needed for achieving success with SaaS.
Abstract:
Bug fixing is a highly cooperative work activity where developers, testers, product managers and other stakeholders collaborate using a bug tracking system. In the context of Global Software Development (GSD), where software development is distributed across different geographical locations, we focus on understanding the role of bug trackers in supporting software bug fixing activities. We carried out a small-scale ethnographic fieldwork in a software product team distributed between Finland and India at a multinational engineering company. Using semi-structured interviews and in-situ observations of 16 bug cases, we show that the bug tracker 1) supported the information needs of different stakeholders, 2) established common ground, and 3) reinforced issues related to ownership, performance and power. Consequently, we provide implications for design around these findings.
Abstract:
This paper presents an overview of the issues in precisely defining, specifying and evaluating the dependability of software, particularly in the context of computer controlled process systems. Dependability is intended to be a generic term embodying various quality factors and is useful for both software and hardware. While the developments in quality assurance and reliability theories have proceeded mostly in independent directions for hardware and software systems, we present here the case for developing a unified framework of dependability—a facet of operational effectiveness of modern technological systems, and develop a hierarchical systems model helpful in clarifying this view. In the second half of the paper, we survey the models and methods available for measuring and improving software reliability. The nature of software “bugs”, the failure history of the software system in the various phases of its lifecycle, the reliability growth in the development phase, estimation of the number of errors remaining in the operational phase, and the complexity of the debugging process have all been considered to varying degrees of detail. We also discuss the notion of software fault-tolerance, methods of achieving the same, and the status of other measures of software dependability such as maintainability, availability and safety.
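The survey covers reliability growth and the estimation of errors remaining in the operational phase. As one hedged illustration of that model class (the Goel-Okumoto NHPP model, a classic example rather than necessarily one the paper analyses), the expected cumulative and remaining fault counts follow directly from the mean value function; the parameter values below are invented for the example:

```python
from math import exp

def go_expected_failures(a, b, t):
    """Goel-Okumoto mean value function m(t) = a(1 - e^{-bt}):
    expected number of faults detected by time t, where a is the total
    expected fault content and b the per-fault detection rate."""
    return a * (1 - exp(-b * t))

def go_remaining(a, b, t):
    """Expected faults still latent at time t under the same model."""
    return a * exp(-b * t)

# e.g. a = 100 expected faults in total, b = 0.05 per week, after 30 weeks
detected = go_expected_failures(100, 0.05, 30)
remaining = go_remaining(100, 0.05, 30)
```

In practice a and b would be fitted (e.g. by maximum likelihood) to the observed failure history before the remaining-fault estimate is read off.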
Abstract:
The literature contains many examples of digital procedures for the analytical treatment of electroencephalograms, but there is as yet no standard by which those techniques may be judged or compared. This paper proposes one method of generating an EEG, based on a computer program for Zetterberg's simulation. It is assumed that the statistical properties of an EEG may be represented by stationary processes having rational transfer functions, achieved by a system of software filters and random number generators. The model represents neither the neurological mechanism responsible for generating the EEG, nor any particular type of EEG record; transient phenomena such as spikes, sharp waves and alpha bursts also are excluded. The basis of the program is a valid ‘partial’ statistical description of the EEG; that description is then used to produce a digital representation of a signal which, if plotted sequentially, might or might not by chance resemble an EEG; that is unimportant. What is important is that the statistical properties of the series remain those of a real EEG; it is in this sense that the output is a simulation of the EEG. There is considerable flexibility in the form of the output, i.e. its alpha, beta and delta content, which may be selected by the user, the same selected parameters always producing the same statistical output. The filtered outputs from the random number sequences may be scaled to provide realistic power distributions in the accepted EEG frequency bands and then summed to create a digital output signal, the ‘stationary EEG’. It is suggested that the simulator might act as a test input to digital analytical techniques for the EEG, a simulator which would enable at least a substantial part of those techniques to be compared and assessed in an objective manner. The equations necessary to implement the model are given.
The program has been run on a DEC1090 computer but is suitable for any microcomputer having more than 32 kBytes of memory; the execution time required to generate a 25 s simulated EEG is in the region of 15 s.
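The scheme described above — random number generators feeding filters with rational transfer functions, scaled per band and summed, with the same parameters always reproducing the same output — can be sketched as follows. This is a minimal illustration in that spirit, not the paper's program: the sampling rate, resonator bandwidth and band weights are assumed values, and each band here is a simple second-order all-pole resonator.

```python
import random
from math import cos, pi

FS = 100  # sampling rate in Hz (an assumed value, not from the paper)

def band_resonator(noise, f0, r=0.95):
    """Shape white noise with a second-order all-pole resonator centred
    on f0 Hz -- a simple rational transfer function. The pole radius r
    (an assumed value) controls the bandwidth of the spectral peak."""
    w0 = 2 * pi * f0 / FS
    a1, a2 = 2 * r * cos(w0), -r * r
    y = [0.0, 0.0]  # filter state (two previous outputs)
    for x in noise:
        y.append(x + a1 * y[-1] + a2 * y[-2])
    return y[2:]

def simulate_eeg(n_samples, band_weights, seed=0):
    """Sum independently generated band-limited noise processes.
    band_weights maps a centre frequency (Hz) to a user-selected gain;
    the fixed seed makes the same parameters give the same output."""
    rng = random.Random(seed)
    signal = [0.0] * n_samples
    for f0, weight in band_weights.items():
        noise = [rng.gauss(0, 1) for _ in range(n_samples)]
        for i, v in enumerate(band_resonator(noise, f0)):
            signal[i] += weight * v
    return signal

# delta ~2 Hz, alpha ~10 Hz and beta ~20 Hz bands with chosen weights:
# 25 s at 100 Hz, matching the simulated record length quoted above
eeg = simulate_eeg(2500, {2: 1.0, 10: 2.0, 20: 0.5})
```

Adjusting the per-band weights plays the role of the user-selected alpha, beta and delta content, and the deterministic seeding mirrors the property that the same parameters always yield the same statistical output.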
Abstract:
In my master's thesis I examine the development of Finnish-French bilingualism in families in which the parents have different native languages. The aim of the work has been to study how different environmental factors affect the acquisition of bilingualism, and how families' differing investment in learning the minority language, i.e. the language not present in the environment, is reflected in the results achieved. Thirteen families with children aged 10-12 who use French and Finnish daily took part in the study; there were 18 children in total. In order to also examine the effect of the language environment on learning, six of the families were selected from Finland and seven from France and the French-speaking part of Switzerland. My research methods included interviewing the parents to establish the families' sociolinguistic factors, and conversation with the children to assess their oral language skills. The main emphasis in assessing language proficiency, however, was on a written text that the children produced in both languages on the basis of the pictures of a textless book. Error analyses were performed on the texts, dividing the errors into orthographic, semantic and grammatical errors. An average was also calculated for each child, indicating how many words the text contained per error. On the basis of these averages, correspondences between the error counts and the families' sociolinguistic factors were examined. In the conclusion, the results were also compared with the principles offered by linguists presented in the theory section. On the basis of the thesis it can be stated that the influence of the environment appears to have often been underestimated in works on bilingualism. Good proficiency in the minority language requires more than the one language - one person method, in which the parents each speak their own native language to the child. A bilingual school and frequent visits to the other parent's home country in particular were found to be effective means of reinforcement.
Particular investment should be made in the minority-language learning of the family's youngest children, since from birth they also have the opportunity to use the majority language with their older siblings. Regarding the effect of the language environment, it was found that children living in Finland generally had a better command of their minority language than those living in France. This was attributed to the positive influence of the French-Finnish school on language learning and to the valued status of the French language in Finland. Keywords: bilingualism, language learning, bilinguisme, acquisition des langues, couple mixte
Abstract:
This thesis examines the professionalism of Finnish television subtitlers, the translation process, and the effects of digital subtitling software on the subtitling process, from the perspective of professional subtitlers. The digitalization of Finnish television has also caused upheavals in the subtitling field, as the video material to be subtitled is now delivered to translation agencies and subtitlers in digital form. The theory section covers translation and subtitling research as well as training in Finland, professional skill and professionalism, and translation aids. Subtitling is presented as a specialized form of translation; it should also be noted, however, that translation is only one stage in the subtitling process. The theory section concludes with a discussion of the everyday work and current professional landscape of Finnish television subtitlers: subtitlers work under a wide variety of employment terms, and quality criteria may have to be re-evaluated. At the beginning of the empirical section it is noted that surprisingly few Finnish television subtitlers have been interviewed and, drawing on Jääskeläinen's ideas, that much in the field of subtitling remains unstudied, in particular the Finnish subtitling process. The subjects of the study are translators who produce television subtitles professionally. In early winter 2008, a questionnaire was sent to subtitlers working for a Finnish translation agency specializing in subtitling, surveying, through both multiple-choice and open questions, their professionalism, working methods, translation and subtitling process, professional pride and identity, time management, and the digital subtitling software they use. The study found that nearly a third of the respondents have a neutral or even negative view of their profession; what these subtitlers have in common is that all have less than five years of experience in the field. The majority of the respondents, however, are proud to work as professionals of the Finnish language.
In the questionnaire, the subtitling process was divided into a preview stage, a translation stage, a timing (cueing) stage and a revision viewing stage. The subtitlers were asked, among other things, to estimate the total duration of their subtitling process. Large differences emerged in the durations, at least some of which correlated with experience. Just over half of the respondents have acquired digital subtitling software of their own, while some still do the cueing at the translation agency, partly because of the high cost of the software. With digital software, the subtitling process and working practices have changed, as video recorders and televisions have given way to the computer alone. It is now possible to work remotely from distant locations, to translate and cue in alternation, or to pre-cue and then translate. Digital technology has thus enabled the subtitling process to change and made alternative working methods possible, but not all of the methods are necessarily useful to the subtitler. The traditional subtitling process (preview, marking the subtitle divisions in the script, translating and drafting the subtitles, corrections and a checking viewing) still appears to be the most efficient. Although working practices differ, the overall picture is that, after the initial teething problems of digitalization, subtitlers' work has become more efficient.