853 results for Technology-based Firms
Abstract:
Fractal video compression is a relatively new video compression method. Its appeal lies in its high compression ratio and simple decompression algorithm, but its computational complexity is high, so parallel algorithms on high-performance machines offer one way forward. In this study we partition the matching search, which accounts for the majority of the work in fractal video compression, into small tasks and implement them in two distributed computing environments, one using DCOM and the other using .NET Remoting technology, based on a local area network consisting of loosely coupled PCs. Experimental results show that the parallel algorithm achieves a high speedup in these distributed environments.
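To make the partitioning concrete, here is a minimal sketch (not the paper's code) of how the matching search can be split into one independent task per range block and farmed out to workers; a local process pool stands in for the DCOM/.NET Remoting workers on the LAN, and the block sizes and least-squares match are generic choices:

```python
# Illustrative sketch: partition the fractal matching search into one
# task per range block, then distribute the tasks to workers (a local
# process pool stands in here for the paper's DCOM/.NET Remoting nodes).
from multiprocessing import Pool
import numpy as np

BLOCK = 8  # range-block size; domain blocks are 2*BLOCK, then downsampled

def best_match(task):
    """Search all domain blocks for the best affine match to one range block."""
    range_block, domains = task
    best = (float("inf"), -1, 0.0, 0.0)  # (error, domain index, scale s, offset o)
    r = range_block.ravel().astype(float)
    for i, d in enumerate(domains):
        dd = d.ravel().astype(float)
        var = dd.var()
        s = 0.0 if var == 0 else np.cov(dd, r, bias=True)[0, 1] / var
        o = r.mean() - s * dd.mean()
        err = ((s * dd + o - r) ** 2).sum()
        if err < best[0]:
            best = (err, i, s, o)
    return best

def downsample(block):
    """Average 2x2 cells so a domain block matches the range-block size."""
    return block.reshape(BLOCK, 2, BLOCK, 2).mean(axis=(1, 3))

if __name__ == "__main__":
    frame = np.random.randint(0, 256, (64, 64))
    ranges = [frame[y:y+BLOCK, x:x+BLOCK]
              for y in range(0, 64, BLOCK) for x in range(0, 64, BLOCK)]
    domains = [downsample(frame[y:y+2*BLOCK, x:x+2*BLOCK])
               for y in range(0, 64 - 2*BLOCK + 1, BLOCK)
               for x in range(0, 64 - 2*BLOCK + 1, BLOCK)]
    tasks = [(r, domains) for r in ranges]   # one small task per range block
    with Pool() as pool:                     # stand-in for remote workers
        transforms = pool.map(best_match, tasks)
    print(len(transforms), "range blocks matched")
```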
Abstract:
Purpose: To develop an improved mathematical model for predicting the dosing accuracy of Dosators, based on the geometry of the machine in conjunction with measured flow properties of the powder. Methods: A mathematical model was created using an analytical method of differential slices, incorporating measured flow properties. The key flow properties of interest in this investigation were: flow function, effective angle of wall friction, wall adhesion, bulk density, stress ratio K, and permeability. To simulate the real process and, very importantly, validate the model, a Dosator test-rig was used to measure the forces acting on the Dosator during the filling stage, the force required to eject the dose, and the dose weight. Results: Preliminary results were obtained from the Dosator test-rig. Figure 1 [Omitted] shows the dose weight for different depths to the bottom of the powder bed at the end of the stroke and different levels of pre-compaction of the powder bed. A strong influence on dose weight, arising from the proximity of the Dosator to the bottom of the powder bed at the end of the stroke and from the condition of the powder bed, has been established. Conclusions: The model will provide a useful tool to predict dosing accuracy and thus optimise the future design of Dosator-based equipment, based on measured bulk properties of the powder to be handled. Other important factors with a significant influence on Dosator processes are the condition of the powder bed and the clearance between the Dosator and the bottom of the powder bed.
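The abstract does not give the model itself, but the method of differential slices classically leads to a Janssen-type force balance on the powder column; the sketch below shows that standard balance for a cylindrical bore, with generic symbols (D, K, phi_w, rho_b) rather than the authors' notation:

```latex
% Classical Janssen-type balance from the method of differential slices
% (a sketch of the general approach, not the paper's model): vertical
% stress sigma_v on a powder slice in a cylindrical bore of diameter D,
% bulk density rho_b, stress ratio K, wall friction angle phi_w.
\frac{\mathrm{d}\sigma_v}{\mathrm{d}z} = \rho_b g - \frac{4K\tan\phi_w}{D}\,\sigma_v ,
\qquad
\sigma_v(z) = \frac{\rho_b g D}{4K\tan\phi_w}\left(1 - e^{-4K\tan\phi_w z/D}\right)
```

with the boundary condition sigma_v(0) = 0 at the free surface of the powder bed.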
Abstract:
The objective of this study is to determine the psychometric properties of the Epistemological Beliefs Questionnaire on Mathematics. A total of 171 secondary school mathematics teachers from the Central Region of Cuba participated. The results show acceptable internal consistency. The factorial structure of the scale revealed three major factors, consistent with the Model of the Three Constructs: beliefs about knowledge, about learning, and about teaching. The teachers showed uneven levels of development in their system of epistemological beliefs about mathematics, with a tendency to fall between the naivety and sophistication poles. In conclusion, the questionnaire is useful for evaluating teachers' beliefs about mathematics.
Abstract:
University Science Park incubators (USIs) have emerged as a means by which government, academia, and business can develop high-technology business firms (spin-out HTBFs) from initial conception through to becoming established small firms ready to move beyond the Science Park confines. Although there is considerable literature on how USIs can be improved and developed, there is a paucity of studies exploring how lifecycle development within HTBFs in USIs affects how they use the unique resources and opportunities of the USI. Moreover, the focus has been on single-point-in-time studies, which do not adequately investigate the longitudinal dynamics of HTBF lifecycle development within USIs. Therefore, the aim of this paper is to explore the longitudinal use of the unique resources of the USI by HTBFs at different lifecycle stages. The research methodology involved 18 HTBFs within two separate USIs. A series of longitudinal interviews and focus groups were conducted with HTBFs and USI staff over a 36-month period. NUD*IST software was used in developing the coding and analysis of transcripts. The results show that an HTBF's propensity to make effective use of the USI's resources and support increases as the lifecycle stage of the company advances and the small firm searches for independence and autonomy. Therefore, further research is required to investigate two outstanding questions: firstly, which usage pattern is associated with the HTBF's ultimate success or failure in the marketplace? And secondly, are there any services missing from the observed array that the USI could provide to enhance the HTBF's degree of ultimate success? © 2007 Elsevier Ltd. All rights reserved.
Abstract:
Objectives: To assess whether open angle glaucoma (OAG) screening meets the UK National Screening Committee criteria, to compare screening strategies with case finding, to estimate test parameters, to model estimates of cost and cost-effectiveness, and to identify areas for future research. Data sources: Major electronic databases were searched up to December 2005. Review methods: Screening strategies were developed by wide consultation. Markov submodels were developed to represent screening strategies. Parameter estimates were determined by systematic reviews of epidemiology, economic evaluations of screening, and effectiveness (test accuracy, screening and treatment). Tailored, highly sensitive electronic searches were undertaken. Results: Most potential screening tests reviewed had an estimated specificity of 85% or higher. No test was clearly the most accurate, with only a few, heterogeneous studies for each test. No randomised controlled trials (RCTs) of screening were identified. Based on two treatment RCTs, early treatment reduces the risk of progression. Extrapolating from this, and assuming accelerated progression with advancing disease severity, the mean time to blindness in at least one eye without treatment was approximately 23 years, compared with 35 years with treatment. Prevalence would have to be about 3-4% in 40-year-olds, with a screening interval of 10 years, to approach cost-effectiveness. It is predicted that screening might be cost-effective in a 50-year-old cohort at a prevalence of 4% with a 10-year screening interval. General population screening at any age thus appears not to be cost-effective. Selective screening of groups with higher prevalence (family history, black ethnicity) might be worthwhile, although this would cover only 6% of the population. Extension to include other at-risk cohorts (e.g. myopia and diabetes) would include 37% of the general population, but the prevalence is then too low for screening to be considered cost-effective. Screening using a test with initial automated classification followed by assessment by a specialised optometrist for test positives was more cost-effective than initial specialised optometric assessment. The cost-effectiveness of the screening programme was highly sensitive to the perspective on costs (NHS or societal). In the base-case model, the NHS costs of visual impairment were estimated as £669. If annual societal costs were £8800, then screening might be considered cost-effective for a 40-year-old cohort with 1% OAG prevalence, assuming a willingness to pay of £30,000 per quality-adjusted life-year. Of lesser importance were changes to estimates of attendance for sight tests, incidence of OAG, rate of progression, and utility values for each stage of OAG severity. Cost-effectiveness was not particularly sensitive to the accuracy of screening tests within the ranges observed; however, a highly specific test is required to reduce the large number of false-positive referrals. The finding that population screening is unlikely to be cost-effective is based on an economic model whose parameter estimates carry considerable uncertainty; in particular, if the rate of progression and/or the costs of visual impairment are higher than estimated, then screening could be cost-effective. Conclusions: While population screening is not cost-effective, the targeted screening of high-risk groups may be.
Procedures for identifying those at risk, for quality-assuring the programme, and for adequate service provision for those who screen positive would all be needed. Glaucoma detection can be improved by increasing attendance for eye examination and by improving the performance of current testing, either by refining practice or by adding a technology-based first assessment, the latter being the more cost-effective option. This has implications for any future organisational changes in community eye-care services. Further research should aim to develop and provide quality data to populate the economic model, by conducting a feasibility study of interventions to improve detection, by obtaining further data on the costs of blindness, the risk of progression, and health outcomes, and by conducting an RCT of interventions to improve the uptake of glaucoma testing. © Queen's Printer and Controller of HMSO 2007. All rights reserved.
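As an illustration of the kind of Markov submodel the assessment describes, the following toy sketch cycles a cohort through hypothetical glaucoma states and computes an incremental cost-effectiveness ratio. Every transition probability, cost (apart from the £669 visual-impairment figure quoted above), and utility is invented for illustration and is not a parameter of the report's model:

```python
# Toy Markov cohort sketch of the screening economics described above.
# All transition probabilities, costs and utilities are invented.
import numpy as np

# State order: no OAG, OAG untreated, OAG treated, blind (absorbing)
def run(cohort, P, annual_cost, utility, years=30, discount=0.035):
    """Cycle a cohort vector through transition matrix P, accumulating
    discounted costs and QALYs."""
    cost = qaly = 0.0
    for t in range(years):
        d = 1.0 / (1.0 + discount) ** t
        cost += d * cohort @ annual_cost
        qaly += d * cohort @ utility
        cohort = cohort @ P
    return cost, qaly

# Treatment slows progression to blindness (hypothetical rates).
P = np.array([[0.995, 0.005, 0.0,  0.0 ],
              [0.0,   0.96,  0.0,  0.04],
              [0.0,   0.0,   0.98, 0.02],
              [0.0,   0.0,   0.0,  1.0 ]])

start_no  = np.array([0.96, 0.04,  0.0,   0.0])  # 4% prevalence, undetected
start_yes = np.array([0.96, 0.008, 0.032, 0.0])  # screening detects 80% of cases

annual_cost = np.array([0.0, 50.0, 300.0, 669.0])  # blind-state cost as in text
utility     = np.array([0.95, 0.85, 0.85, 0.5])

c0, q0 = run(start_no,  P, annual_cost, utility)
c1, q1 = run(start_yes, P, annual_cost, utility)
screening_cost = 25.0  # hypothetical per-person cost of the screen
icer = (c1 + screening_cost - c0) / (q1 - q0)
print(f"ICER ~ {icer:,.0f} GBP per QALY (toy numbers)")
```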
Abstract:
In an effort to develop a novel electronic paper image display technology based on the electrowetting principle, a 3-D electrowetting cell is designed and fabricated. It consists of two 3-D bent electrodes, each having a horizontal surface made of gold and a vertical surface made of indium tin oxide (ITO) glass serving as a color display window, a layer of dielectric material on the 3-D electrodes, and a highly fluorinated hydrophobic layer on the surface of the dielectric layer. Results of this work show that electrowetting-induced motion of an aqueous droplet in immiscible oils can be achieved reversibly across the boundary between the horizontal and vertical surfaces of the 3-D electrode. It is also shown that the droplet can maintain its wetting state on a vertical sidewall electrode without a power supply once the voltage is removed. This phenomenon may form the basis for color contrast modulation applications where a power-free image display is required, such as future electronic paper display technology. (C) 2009 Society of Photo-Optical Instrumentation Engineers. [DOI: 10.1117/1.3100201]
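The electrowetting principle exploited by the cell is conventionally described by the Young-Lippmann relation; the standard textbook form is given below for orientation (the paper's own parameter values are not reproduced here):

```latex
% Young-Lippmann relation commonly used to describe electrowetting on a
% dielectric of thickness d and relative permittivity eps_r (standard
% textbook form, not taken from the paper):
\cos\theta(V) = \cos\theta_0 + \frac{\varepsilon_0 \varepsilon_r}{2\gamma d}\,V^2
```

where theta_0 is the contact angle at zero bias and gamma the interfacial tension between the droplet and the surrounding oil.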
Abstract:
Background The use of technology in healthcare settings is on the increase and may represent a cost-effective means of delivering rehabilitation. Reductions in treatment time and delivery in the home are also thought to be benefits of this approach. Children and adolescents with brain injury often experience deficits in memory and executive functioning that can negatively affect their school work, social lives, and future occupations. Effective interventions that can be delivered at home, without the need for high-cost clinical involvement, could provide a means to address a current lack of provision. We have systematically reviewed studies examining the effects of technology-based interventions for the rehabilitation of deficits in memory and executive functioning in children and adolescents with acquired brain injury. Objectives To assess the effects of technology-based interventions compared to placebo intervention, no treatment, or other types of intervention, on the executive functioning and memory of children and adolescents with acquired brain injury. Search methods We ran the search on 30 September 2015. We searched the Cochrane Injuries Group Specialised Register, the Cochrane Central Register of Controlled Trials (CENTRAL), Ovid MEDLINE(R), Ovid MEDLINE(R) In-Process & Other Non-Indexed Citations, Ovid MEDLINE(R) Daily and Ovid OLDMEDLINE(R), EMBASE Classic + EMBASE (OvidSP), ISI Web of Science (SCI-EXPANDED, SSCI, CPCI-S, and CPSI-SSH), CINAHL Plus (EBSCO), two other databases, and clinical trials registers. We also searched the internet, screened reference lists, and contacted authors of included studies. Selection criteria Randomised controlled trials comparing the use of a technological aid for the rehabilitation of children and adolescents with memory or executive-functioning deficits with placebo, no treatment, or another intervention. Data collection and analysis Two review authors independently reviewed titles and abstracts identified by the search strategy. Following retrieval of full-text manuscripts, two review authors independently performed data extraction and assessed the risk of bias. Main results Four studies (involving 206 participants) met the inclusion criteria for this review. Three studies, involving 194 participants, assessed the effects of online interventions targeting executive functioning (that is, monitoring and changing behaviour, problem solving, planning, etc.). These studies, which were all conducted by the same research team, compared online interventions against a 'placebo' (participants were given internet resources on brain injury). The interventions were delivered in the family home with additional support or training, or both, from a psychologist or doctoral student. The fourth study investigated the use of a computer program to target memory in addition to components of executive functioning (that is, attention, organisation, and problem solving). No information on the study setting was provided; however, a speech-language pathologist, teacher, or occupational therapist accompanied participants. Two studies assessed adolescents and young adults with mild to severe traumatic brain injury (TBI), while the remaining two studies assessed children and adolescents with moderate to severe TBI. Risk of bias We assessed the risk of selection bias as low for three studies and unclear for one study. Allocation bias was high in two studies, unclear in one study, and low in one study.
Only one study (n = 120) was able to conceal allocation from participants; therefore, overall selection bias was assessed as high. One study took steps to blind assessors to allocation (low risk of detection bias), while the other three did not (high risk of detection bias). Primary outcome 1: Executive functioning: Technology-based intervention versus placebo Results from a meta-analysis of three studies (n = 194) comparing online interventions with a placebo for children and adolescents with TBI favoured the intervention immediately post-treatment (standardised mean difference (SMD) -0.37, 95% confidence interval (CI) -0.66 to -0.09; P = 0.62; I² = 0%). (As there is no 'gold standard' measure in the field, we have not translated the SMD back to any particular scale.) This result is thought to represent only a small to medium effect size (using Cohen's rule of thumb, where 0.2 is a small effect, 0.5 a medium one, and 0.8 or above a large effect); this is unlikely to have a clinically important effect on the participant. The fourth study (n = 12) reported differences between the intervention and control groups on problem solving (an important component of executive functioning). No means or standard deviations were presented for this outcome, therefore an effect size could not be calculated. The quality of evidence for this outcome according to GRADE was very low, meaning that future research is highly likely to change the estimate of effect. Primary outcome 2: Memory One small study (n = 12) reported a statistically significant difference in improvement in sentence recall between the intervention and control groups following an eight-week remediation programme. No means or standard deviations were presented for this outcome, therefore an effect size could not be calculated. Secondary outcomes Two studies (n = 158) reported on anxiety/depression as measured by the Child Behavior Checklist (CBCL) and were included in a meta-analysis. We found no evidence of an effect with the intervention (mean difference -5.59, 95% CI -11.46 to 0.28; I² = 53%). The GRADE quality of evidence for this outcome was very low, meaning that future research is likely to change the estimate of effect. A single study sought to record adverse events and reported none. Two studies reported on use of the intervention (range 0 to 13 and 1 to 24 sessions). One study reported on social functioning/social competence and found no effect. The included studies reported no data for the other secondary outcomes (that is, quality of life and academic achievement). Authors' conclusions This review provides low-quality evidence for the use of technology-based interventions in the rehabilitation of executive functions and memory for children and adolescents with TBI. As all of the included studies contained relatively small numbers of participants (12 to 120), our findings should be interpreted with caution. The involvement of a clinician or therapist, rather than the use of the technology itself, may have led to the success of these interventions. Future research should seek to replicate these findings with larger samples, in other regions, using ecologically valid outcome measures, and with reduced clinician involvement.
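For readers unfamiliar with how a pooled SMD such as the one above is obtained, the following sketch shows generic inverse-variance (fixed-effect) pooling; the three (SMD, standard error) pairs are placeholders, not the review's per-study data:

```python
# Sketch of the inverse-variance pooling behind a meta-analytic SMD.
# The three (SMD, SE) pairs below are placeholders, not the review's data.
import math

studies = [(-0.30, 0.25), (-0.45, 0.22), (-0.35, 0.30)]  # (SMD, standard error)

weights = [1.0 / se**2 for _, se in studies]          # inverse-variance weights
pooled = sum(w * smd for (smd, _), w in zip(studies, weights)) / sum(weights)
se_pooled = math.sqrt(1.0 / sum(weights))

lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
print(f"Pooled SMD {pooled:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```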
Abstract:
Renewable energies have been in the spotlight since the end of the twentieth century, for several reasons. Forecasts point to the depletion of fossil fuel reserves, notably oil and natural gas, during the present century. Coal, although still abundant, poses significant environmental problems. The hazards associated with nuclear power are leading the governments of several countries to rethink their energy policies. All of these technologies have strong environmental impacts. Within the set of renewable energies, photovoltaic solar energy still carries little weight in the current energy-production landscape, a fact explained by the still-high cost of photovoltaic systems. Several government initiatives, such as SET for 2020 (EU) and Sunshot (USA), are under way to develop technologies that address this problem. The market share held by thin-film technology is still small but has been growing in recent years. Its advantages over traditional Si-based technology are several, for example the lower energy and material costs of cell fabrication. This dissertation presents a fabrication process for thin-film solar cells using a new semiconductor compound, Cu2ZnSnS4, as the absorber layer; its great advantage over its predecessors is that it is made up of abundant elements of low toxicity. A study of the thermodynamic growth conditions of this compound was carried out, together with the characterisation of the compound and of the final solar cells. The work also includes a study of the ternary compounds CuxSnSx+1 and the binary compounds SnxSy, justified by the fact that they appear as secondary phases during the growth of Cu2ZnSnS4. The chapters of the thesis are briefly described below. Chapter 1 briefly covers the motivation and the place of the technology in the global energy landscape; the solar-cell structure adopted in this work is also described. Chapter 2 is reserved for a more detailed description of the Cu2ZnSnS4 compound, namely its structural and opto-electronic properties; the latter are used to explain the non-stoichiometric compositions applied in the growth of this compound. The various growth techniques reported in the literature are also described, and the final section of the chapter presents the characterisation results published by the groups studying this compound. Chapter 3 covers the method implemented to grow the absorber layer and the effects of varying the process parameters, and also includes a detailed description of the equipment used to characterise the absorber layer and the final solar cells. The binary and ternary chalcogenide phases are studied in Chapter 4, which describes the growth method for both the CuxSnSx+1 and SnxSy phases and their basic characterisation, namely their composition and their structural, optical, and electrical properties; for the binary compounds, the results of a solar cell are also presented. Chapter 5 reports the characterisation results for the Cu2ZnSnS4 films. Techniques such as Raman scattering, photoluminescence, external quantum efficiency, and admittance spectroscopy are used to analyse the properties of both the absorber layer and the solar cell. Chapter 6 presents a general conclusion of the work and suggestions to improve and complement the studies carried out.
Abstract:
We have developed an in-house pipeline for the processing and analysis of sequence data generated during Illumina technology-based metagenomic studies of the human gut microbiota. Each component of the pipeline was selected following comparative analysis of available tools; however, the modular nature of the software allows any individual component to be replaced should a better tool become available in due course. The pipeline consists of quality analysis and trimming followed by taxonomic filtering of sequence data, allowing reads associated with samples to be binned according to whether they represent human, prokaryotic (bacterial/archaeal), viral, parasite, fungal, or plant DNA. Viral, parasite, fungal, and plant DNA can be assigned to species level on a presence/absence basis, allowing, for example, identification of dietary intake of plant-based foodstuffs and their derivatives. Prokaryotic DNA is subject to taxonomic and functional analyses, with assignment to taxonomic hierarchies (kingdom, class, order, family, genus, species, strain/subspecies) and abundance determination. After de novo assembly of sequence reads, genes within samples are predicted and used to build a non-redundant gene catalogue. From this catalogue, per-sample gene abundance can be determined after normalisation of the data based on gene length. Functional annotation of genes is achieved through mapping of gene clusters against KEGG proteins, and via InterProScan. The pipeline is undergoing validation using the human faecal metagenomic data of Qin et al. (2014, Nature 513, 59–64). Outputs from the pipeline allow the development of tools for the integration of metagenomic and metabolomic data, moving metagenomic studies beyond the determination of gene richness and representation towards microbial-metabolite mapping. There is scope to improve the outputs from the viral, parasite, fungal, and plant DNA analyses, depending on the depth of sequencing associated with samples. The pipeline can easily be adapted for the analysis of environmental and non-human animal samples, and for use with data generated by non-Illumina sequencing platforms.
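As an illustration of the gene-length normalisation step mentioned above, the short sketch below computes an RPKM-like per-sample gene abundance; the gene names, lengths, and read counts are invented:

```python
# Sketch of length-normalised per-sample gene abundance (an RPKM-like
# measure): reads per kilobase of gene per million mapped reads.
# Gene names, lengths and counts are invented for illustration.
gene_lengths = {"geneA": 1200, "geneB": 450, "geneC": 3000}   # bp
mapped_reads = {"geneA": 840, "geneB": 512, "geneC": 1900}    # reads per sample

total_reads = sum(mapped_reads.values())

def abundance(gene):
    """Normalise a gene's read count by gene length and sequencing depth."""
    rpk = mapped_reads[gene] / (gene_lengths[gene] / 1000.0)
    return rpk / (total_reads / 1e6)

for g in gene_lengths:
    print(f"{g}: {abundance(g):.1f}")
```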
Abstract:
This work consisted of the development and characterisation of potentiometric sensors based on molecularly imprinted polymers for the determination of an antibiotic, norfloxacin (NOR), in aquaculture. Potentiometric sensors were chosen for their simplicity, low cost, and fast, reversible interaction with the analytes. The sensing material was obtained by molecular imprinting technology based on bulk polymerisation, in which NOR was the template molecule and the sensors were self-assembled using pyrrole as the monomer, either alone or together with silica-gel particles functionalised with 3-aminopropyl groups. A control sensing material was also prepared in which the NOR template molecule was absent (non-imprinted polymer, NIP). The sensing materials were characterised by scanning electron microscopy (SEM) and Fourier-transform infrared spectroscopy. They were then included in polymeric membranes, which were incorporated into electrodes. The performance of the electrodes was evaluated by means of calibration curves in different media (PBS, MES, and HEPES). The sensitivity of the electrodes was also successfully analysed in doped water. The various evaluations and analyses carried out led to the conclusion that the pyrrole-based MIP with an anionic additive was the sensing material tested that yielded the best response properties.
Abstract:
Software as a service (SaaS) is a service model in which applications are accessible from various client devices over the internet. Several studies report possible factors driving the adoption of SaaS, but none has considered the perception of SaaS features together with the pressures existing in the organization's environment. We propose an integrated research model that combines process virtualization theory (PVT) and institutional theory (INT). PVT seeks to explain whether SaaS processes are suitable for migration into virtual environments via an information technology-based mechanism. INT seeks to explain the effects of the institutionalized environment on the structure and actions of the organization. The research makes three contributions. First, it addresses a gap in the SaaS adoption literature by studying the internal perception of the technical features of SaaS and the external coercive, normative, and mimetic pressures faced by an organization. Second, it empirically tests many of the propositions of PVT and INT in the SaaS context, thereby helping to determine how the theories operate in practice. Third, the integration of PVT and INT contributes to the information systems (IS) discipline by deepening the applicability and strengths of these theories.
Abstract:
Improving the quality of medication use in primary care has become a crucial issue. Community pharmacists position themselves as central actors in achieving this goal, calling for an extension of their role. The main objective of this thesis is to better understand how electronic prescribing (eRx) technologies influence the transformation of the community pharmacist's role. The first article presents the results of a case study that addresses the transformation of the community pharmacists' role through the concept of professionalisation. It proposes a logic model of the influences of an eRx technology on this professionalisation, built on Davenport's typology. This logic model was validated by interviewing twelve community pharmacists taking part in a typical eRx technology pilot project. Based on the community pharmacists' perceptions, we established that the technology could support the professionalisation of pharmacists through five mechanisms: analytical capability, disintermediation, integration, automation, and knowledge dissemination. The second article analyses the disturbances induced by the various functions of eRx technologies on the stability of the community pharmacists' jurisdiction, using a framework adapted from Abbott. Drawing on thirty-three interviews with practitioners (physicians and pharmacists) and elites, this case study describes in detail the influences of the various functions on the professionals' modes of action, as well as the issues raised by these possibilities. The main disturbance is linked to changes in the distribution of information, which influences the professionals' diagnostic and inference activities. The technology can redistribute medication-management information to the benefit of physicians as much as of pharmacists, which creates tensions between physicians and pharmacists, but also among pharmacists themselves. The third article presents a systematic review synthesising the studies that have evaluated the effects of second-generation eRx technologies on medication management in primary care. The review covers nineteen studies conducted with observational methods. The reported results reveal that the technologies are highly heterogeneous and most often immature, and that their effects have been little studied beyond user perceptions, which are mixed. The only demonstrated positive effect is an improvement in the quality of the medication profile accessible to professionals, while negative effects have been demonstrated in prescription filling, such as an increase in the number of clarification calls from the pharmacist to the prescriber. It therefore seems that little is known about the effects of second-generation eRx technologies. Together, these three studies show that new eRx technologies can indeed influence the transformation of the community pharmacist's role by disturbing the characteristics of prescriptions and, above all, information and its distribution. These disturbances generate opportunities for an extension of the community pharmacists' role, while highlighting the intra- and interprofessional challenges associated with realising those opportunities. Overall, our results underline that the disturbances associated with eRx technologies go beyond the technical aspects of users' work to encompass multiple disturbances to the very nature of professionals' work and roles. Decision-makers and actors involved in deploying eRx technologies would do well to take all of these considerations into account in order to bring the observed effects closer to the promised benefits of these technologies.
Abstract:
The adoption in 2001 of the Act to establish a legal framework for information technology put in place a legal framework fostering the integration of information technology into the law. In the law of evidence in particular, it conferred on the technology-based document the status of evidence. In this context it was necessary to adapt certain articles of the Civil Code of Québec and, by the same token, certain rules, including the best evidence rule set out in article 2860 C.c.Q. Until then, this rule rested on the notion of the original, a notion specific to the paper medium, for which an equivalent had to be found for the technology-based document. This is what the Act did by setting out, in section 12, the characteristics of the technology-based original. We examine this notion by looking at its origins and justifications, and then analyse section 12 of the Act, which deals with the original in technology-based form. Finally, we consider the place of reproductions in the technological context and show that they have taken on ever greater importance alongside the original document as the means of reproduction have improved.
Abstract:
The coming into force of the Act to establish a legal framework for information technology (hereinafter the Act) is the embodiment of the law's recognition of technology-based evidence. The notion of the technology-based document is central both to the Act and to the Civil Code of Québec, and it has been fully integrated into the Code's various means of proof. We examine this notion of the technology-based document, focusing on its structuring elements, metadata. We consider the notion of metadata, its origins and its main fields of application, which make it a priori an essentially technological object, before considering it in an evidentiary context. We then examine the evidentiary potential that metadata represent in support of a technology-based document. Finally, we consider their evidentiary role in relation to the notions of copy and transfer and the obligations imposed by the Act, namely certification and documentation, so that these two modes of reproducing documents may legally stand in lieu of the original document.
Abstract:
Conventional floating-gate non-volatile memories (NVMs) face critical scalability issues beyond the sub-90 nm node, such as gate-length and tunnel-oxide thickness reduction. Nanocrystalline germanium (nc-Ge) quantum dot flash memories are a fully CMOS-compatible technology based on discrete, isolated charge-storage nodules, which have the potential to push the scalability of conventional NVMs further. Quantum dot memories offer lower operating voltages than conventional floating-gate (FG) Flash memories because their thinner tunnel dielectrics allow higher tunneling probabilities. The isolated charge nodules suppress charge loss through lateral paths, thereby achieving a superior charge retention time. Despite the considerable effort devoted to the study of nanocrystal Flash memories, the charge-storage mechanism remains obscure. Recent studies suggest that interfacial defects of the nanocrystals play a role in charge storage, although storage in the nanocrystal conduction band by quantum confinement was reported earlier. In this work, a single-transistor memory structure is presented with a threshold voltage shift, ΔVth, exceeding ~1.5 V, corresponding to interface charge trapping in nc-Ge, operating at 0.96 MV/cm. The trapping effect is eliminated when nc-Ge is synthesized in forming gas, thus excluding the possibility of quantum confinement and Coulomb blockade effects. Through discharging kinetics, the model of deep-level trap charge storage is confirmed. The trap energy level depends on the matrix confining the nc-Ge.
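For context, the threshold-voltage shift produced by charge stored in a nanocrystal layer is commonly estimated in the nanocrystal-memory literature with the planar-capacitor expression below; this standard formula is given for orientation only and is not derived in the abstract (symbols are generic, not the paper's):

```latex
% Standard planar estimate for the threshold-voltage shift from charge
% stored in a nanocrystal layer (common literature formula; symbols are
% generic): n_nc is the areal density of stored electrons, t_cntrl the
% control-oxide thickness, t_nc the nanocrystal diameter.
\Delta V_{th} = \frac{q\,n_{nc}}{\varepsilon_{ox}}
\left( t_{cntrl} + \frac{\varepsilon_{ox}}{2\,\varepsilon_{nc}}\, t_{nc} \right)
```

where eps_ox and eps_nc are the permittivities of the control oxide and of the nanocrystal material, respectively.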