388 results for HIPERPAV II (Computer file)
Abstract:
This paper addresses the development of trust in the use of Open Data through the incorporation of appropriate authentication and integrity parameters, for use by end-user Open Data application developers, in an architecture for trustworthy Open Data Services. The advantage of this architecture is that it is far more scalable than yet another certificate-based hierarchy, with its attendant problems of certificate revocation management. With the use of a Public File, if the key is compromised it is a simple matter for the single responsible entity to replace the key pair with a new one and re-perform the data file signing process. Under the proposed architecture, the Open Data environment does not interfere with the internal security schemes that an entity might employ. However, the architecture incorporates, where needed, parameters from the entity, e.g. the person who authorized publishing as Open Data, at the time that datasets are created or added.
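The signing flow the abstract alludes to can be illustrated with a minimal sketch. The paper does not specify an algorithm, so the choice of Ed25519 and the helper names below are assumptions; the point is only the flow: the publishing entity signs each data file, publishes the public key in its Public File, and rotates the pair on compromise.

```python
# Minimal sketch of the Public File signing flow (algorithm choice assumed).
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def sign_dataset(private_key: Ed25519PrivateKey, dataset: bytes) -> bytes:
    """Publisher side: sign the raw bytes of an Open Data file."""
    return private_key.sign(dataset)

def verify_dataset(public_key, dataset: bytes, signature: bytes) -> bool:
    """Consumer side: check the file against the key from the Public File."""
    try:
        public_key.verify(signature, dataset)
        return True
    except InvalidSignature:
        return False

# On key compromise, the single responsible entity simply generates a new
# pair, re-signs its data files, and replaces the key in the Public File.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()
data = b"station,reading\nA,42\n"
signature = sign_dataset(private_key, data)
assert verify_dataset(public_key, data, signature)
```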
Abstract:
The control of environmental factors in open-office environments, such as lighting and temperature, is becoming increasingly automated. This development means that office inhabitants are losing the ability to manually adjust environmental conditions according to their needs. In this paper we describe the design, use and evaluation of MiniOrb, a system that employs ambient and tangible interaction mechanisms to allow inhabitants of office environments to maintain awareness of environmental factors, report on their own subjectively perceived office comfort levels and see how these compare to group average preferences. The system is complemented by a mobile application, which enables users to see and set the same sensor values and preferences, but using a screen-based interface. We give an account of the system’s design and outline the results of an in-situ trial and user study. Our results show that devices that combine ambient and tangible interaction approaches are well suited to the task of recording indoor climate preferences and afford a rich set of possible interactions that can complement those enabled by more conventional screen-based interfaces.
Abstract:
The monoanionic ligand 1,1,3,3-tetracyano-2-ethoxypropenide (tcnoet) is reported together with its Cu(II)–bpy complex of formula [Cu2(µ-tcnoet)2(tcnoet)2(bpy)2]. The structure has been determined using X-ray diffraction and features an alternating chain with bridging tcnoet ligands. One ligand acts as a bidentate, dinucleating ligand with one short Cu–N and one medium Cu–N bond, whereas the other tcnoet is largely monodentate, albeit with a very weak interdimer Cu–N bond. Despite the arrangement in dinuclear units, further linked into linear chains through the non-bridging tcnoet ligand, the compound shows no significant magnetic exchange, as deduced from magnetic susceptibility measurements down to 4 K. Ligand-field, IR and EPR spectra in the solid state and in frozen solution are reported and are consistent with the overall structure.
Abstract:
Recent studies have identified high zinc levels in various environmental resources, and excessive intake of zinc has long been considered harmful to human health. The aim of this research was to investigate the effectiveness of tricalcium aluminate (C3A) as an agent for removing zinc from aqueous solution. Inductively coupled plasma-atomic emission spectrometry (ICP-AES), X-ray diffraction (XRD) and scanning electron microscopy (SEM) were used to characterize the removal behavior. The effects of factors such as pH, temperature and contact time were investigated. The adsorption capacity of C3A for Zn2+ was computed to be up to 13.73 mmol g⁻¹, and the highest zinc removal capacity was obtained when the initial pH of the Zn(NO3)2 solution was between 6.0 and 7.0, with temperature around 308 K. XRD analysis showed that the resultant products were ZnAl-LDHs. Combined with analysis of the solution composition, this demonstrated that both precipitation and cation exchange operate in the removal process. From the experimental results, it is clear that C3A could potentially be used as a cost-effective material for the removal of zinc from aqueous environments.
Abstract:
PURPOSE Brivanib, an oral, multi-targeted tyrosine kinase inhibitor with activity against vascular endothelial growth factor (VEGF) and fibroblast growth factor receptor (FGFR), was investigated as a single agent in a phase II trial to assess its activity and tolerability in recurrent or persistent endometrial cancer (EMC). PATIENTS AND METHODS Eligible patients had persistent or recurrent EMC after receiving one to two prior cytotoxic regimens, measurable disease, and a performance status of ≤2. Treatment consisted of brivanib 800 mg orally every day until disease progression or prohibitive toxicity. Primary endpoints were progression-free survival (PFS) at six months and objective tumor response. Expression of multiple angiogenic proteins and FGFR2 mutation status were assessed. RESULTS Forty-five patients were enrolled. Forty-three patients were eligible and evaluable. Median age was 64 years. Twenty-four patients (55.8%) had received prior radiation. The median number of cycles was two (range 1-24). No GI perforations were seen, but one rectal fistula occurred. Nine patients had grade 3 hypertension, with one experiencing grade 4 confusion. Eight patients (18.6%; 90% CI 9.6%-31.7%) had responses (one CR and seven PRs), and 13 patients (30.2%; 90% CI 18.9%-43.9%) were progression-free at six months. Median PFS and overall survival (OS) were 3.3 and 10.7 months, respectively. When modeled jointly, VEGF and angiopoietin-2 expression may diametrically predict PFS. Estrogen receptor-α (ER) expression was positively correlated with OS. CONCLUSION Brivanib is reasonably well tolerated and worthy of further investigation based on PFS at six months in recurrent or persistent EMC.
Abstract:
Emergency Response Teams increasingly use interactive technology to help manage information and communications. The challenge is to maintain high situation awareness across interactive devices of different sizes. This research compared a handheld interactive device, in the form of an iPad, with a large interactive multi-touch tabletop. A search-and-rescue-inspired simulator was designed to test operator situation awareness on the two device sizes. The results show that operators had better situation awareness on the tabletop device when the operation involved detecting moving targets, searching target locations, distinguishing target types, and comprehending displayed information.
Abstract:
Substation Automation Systems have undergone many transformational changes triggered by improvements in technology. Prior to the digital era, it made sense to confirm that the physical wiring matched the schematic design by meticulous and laborious point-to-point testing. In this way, human errors in either the design or the construction could be identified and fixed prior to entry into service. However, even though modern secondary systems are now largely computerised, we still undertake commissioning testing using the same philosophy, as if each signal were hard wired. This is slow and tedious and does not do justice to modern computer systems and software automation. One of the major architectural advantages of the IEC 61850 standard is that it “abstracts” the definition of data and services independently of any protocol, allowing them to be mapped to any protocol that can meet the modelling and performance requirements. On this basis, any substation element can be defined using these common building blocks and made available at the design, configuration and operational stages of the system. The primary advantage of accessing data using this methodology, rather than the traditional position-based method (such as DNP 3.0), is that generic tools can be created to manipulate data. Self-describing data contains the information that these tools need to manipulate different data types correctly. More importantly, self-describing data makes the interface between programs robust and flexible. This paper proposes that the improved data definitions and methods for dealing with this data within a tightly bound and compliant IEC 61850 Substation Automation System could completely revolutionise commissioning testing when compared to traditional point-to-point methods. Using the outcomes of an undergraduate thesis project, we demonstrate with some certainty that it is possible to automatically test the configuration of a protection relay by comparing the IEC 61850 configuration extracted from the relay against its SCL file, for multiple relay vendors. The software tool provides a quick and automatic check that the data sets on a particular relay are correct according to its CID file, thus ensuring that no unexpected modifications are made at any stage of the commissioning process. The tool has been implemented in a Java programming environment using an open source IEC 61850 library to facilitate the server-client association with the relay.
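The comparison step described above can be sketched generically. The actual tool is written in Java against an open source IEC 61850 library; the Python fragment below is only an illustrative sketch, assuming a standard SCL/CID XML file and stubbing the relay read-out as a plain mapping of dataset names to member references.

```python
# Sketch: compare DataSet definitions in a CID (SCL XML) file against those
# read back from the relay (stubbed here as a dict of name -> member set).
import xml.etree.ElementTree as ET

SCL_NS = "{http://www.iec.ch/61850/2003/SCL}"

def datasets_from_cid(cid_path: str) -> dict:
    """Map each DataSet name in the CID file to its set of FCDA references."""
    root = ET.parse(cid_path).getroot()
    datasets = {}
    for ds in root.iter(SCL_NS + "DataSet"):
        members = set()
        for fcda in ds.findall(SCL_NS + "FCDA"):
            a = fcda.attrib
            members.add("{}/{}{}{}.{}".format(
                a.get("ldInst", ""), a.get("prefix", ""), a.get("lnClass", ""),
                a.get("lnInst", ""), a.get("doName", "")))
        datasets[ds.get("name")] = members
    return datasets

def compare_datasets(engineered: dict, as_loaded: dict) -> bool:
    """Flag any dataset whose engineered and as-loaded members differ."""
    ok = True
    for name, expected in engineered.items():
        actual = as_loaded.get(name, set())
        if expected != actual:
            ok = False
            print(f"Mismatch in {name}: missing={expected - actual}, "
                  f"unexpected={actual - expected}")
    return ok
```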
Abstract:
A description of a computer program to analyse cine angiograms of the heart and pressure waveforms to calculate valve gradients.
Abstract:
Hyperthermia, raised temperature, has been used as a means of treating cancer for centuries. Hippocrates (400 BC) and Galen (AD 200) used red-hot irons to treat small tumours. Much later, after the Renaissance, there are many reports of spontaneous tumour regression in patients with fevers produced by erysipelas, malaria, smallpox, tuberculosis and influenza. These illnesses produce fevers of about 40 °C which last for several days. Temperatures of at least 40 °C were found to be necessary for tumour regression. Towards the end of the nineteenth century pyrogenic bacteria were injected into patients with cancer. In 1896, Coley used a mixture of erysipelas and B. prodigiosus, with some success...
Abstract:
Density functional theory (DFT) calculations were performed to study the structural, mechanical, electronic and optical properties of, and strain effects in, single-layer sodium phosphidostannate(II) (NaSnP). We find that exfoliation of single-layer NaSnP from the bulk form is highly feasible, because the cleavage energy is comparable to those of graphite and MoS2. In addition, the breaking strain of the NaSnP monolayer is comparable to other widely studied 2D materials, indicating excellent mechanical flexibility of 2D NaSnP. Using the hybrid functional method, the calculated band gap of single-layer NaSnP is close to the ideal band gap of solar cell materials (1.5 eV), demonstrating great potential for future photovoltaic applications. Furthermore, a strain effect study shows that a moderate compression (2%) can trigger an indirect-to-direct gap transition, which would enhance the light absorption of the NaSnP monolayer. Under sufficient compression (8%), single-layer NaSnP can be tuned from semiconductor to metal, suggesting applications in nanoelectronic devices based on strain engineering techniques.
Abstract:
In 2009, the National Research Council of the National Academies released a report on A New Biology for the 21st Century. The council preferred the term ‘New Biology’ to capture the convergence and integration of the various disciplines of biology. The National Research Council stressed: ‘The essence of the New Biology, as defined by the committee, is integration—re-integration of the many sub-disciplines of biology, and the integration into biology of physicists, chemists, computer scientists, engineers, and mathematicians to create a research community with the capacity to tackle a broad range of scientific and societal problems.’ They define the ‘New Biology’ as ‘integrating life science research with physical science, engineering, computational science, and mathematics’. The National Research Council reflected: 'Biology is at a point of inflection. Years of research have generated detailed information about the components of the complex systems that characterize life––genes, cells, organisms, ecosystems––and this knowledge has begun to fuse into greater understanding of how all those components work together as systems. Powerful tools are allowing biologists to probe complex systems in ever greater detail, from molecular events in individual cells to global biogeochemical cycles. Integration within biology and increasingly fruitful collaboration with physical, earth, and computational scientists, mathematicians, and engineers are making it possible to predict and control the activities of biological systems in ever greater detail.' The National Research Council contended that the New Biology could address a number of pressing challenges. First, it stressed that the New Biology could ‘generate food plants to adapt and grow sustainably in changing environments’. Second, the New Biology could ‘understand and sustain ecosystem function and biodiversity in the face of rapid change’. Third, the New Biology could ‘expand sustainable alternatives to fossil fuels’. Moreover, it was hoped that the New Biology could lead to a better understanding of individual health: ‘The New Biology can accelerate fundamental understanding of the systems that underlie health and the development of the tools and technologies that will in turn lead to more efficient approaches to developing therapeutics and enabling individualized, predictive medicine.’ Biological research has certainly been changing direction in response to changing societal problems. Over the last decade, increasing awareness of the impacts of climate change and dwindling supplies of fossil fuels can be seen to have generated investment in fields such as biofuels, climate-ready crops and storage of agricultural genetic resources. In considering biotechnology’s role in the twenty-first century, Carlson’s biological-forecasting firm Biodesic states: ‘The problems the world faces today – ecosystem responses to global warming, geriatric care in the developed world or infectious diseases in the developing world, the efficient production of more goods using less energy and fewer raw materials – all depend on understanding and then applying biology as a technology.’ This collection considers the roles of intellectual property law in regulating emerging technologies in the biological sciences.
Stephen Hilgartner comments that patent law plays a significant part in social negotiations about the shape of emerging technological systems or artefacts: 'Emerging technology – especially in such hotbeds of change as the life sciences, information technology, biomedicine, and nanotechnology – became a site of contention where competing groups pursued incompatible normative visions. Indeed, as people recognized that questions about the shape of technological systems were nothing less than questions about the future shape of societies, science and technology achieved central significance in contemporary democracies. In this context, states face ongoing difficulties trying to mediate these tensions and establish mechanisms for addressing problems of representation and participation in the sociopolitical process that shapes emerging technology.' The introduction provides a thumbnail, comparative overview of recent developments in intellectual property and biotechnology, as a foundation for the chapters that follow. Section I of this introduction considers recent developments in United States patent law, policy and practice with respect to biotechnology – in particular, highlighting the Myriad Genetics dispute and the decision of the Supreme Court of the United States in Bilski v. Kappos. Section II considers the cross-currents in Canadian jurisprudence in intellectual property and biotechnology. Section III surveys developments in the European Union – and the interpretation of the European Biotechnology Directive. Section IV focuses upon Australia and New Zealand, and considers the policy responses to the controversy over Genetic Technologies Limited's patents in respect of non-coding DNA and genomic mapping. Section V outlines the parts of the collection and the contents of the chapters.
Abstract:
This series of research vignettes is aimed at sharing current and interesting research findings from our team of international entrepreneurship researchers. This vignette, written by Professor Per Davidsson, examines new evidence on whether entrepreneurship education and training leads to more entrepreneurial action and success.
Abstract:
The evolution of technological systems is hindered by systemic components, referred to as reverse salients, which fail to deliver the necessary level of technological performance, thereby inhibiting the performance delivery of the system as a whole. This paper develops a performance gap measure of reverse salience and applies it in a study of the PC (personal computer) technological system, focusing first on the co-evolution of the CPU (central processing unit) and PC game sub-systems, and second on that of the GPU (graphics processing unit) and PC game sub-systems. Measurement of the temporal behavior of reverse salience indicates that the PC game sub-system is the reverse salient, continuously trailing the technological performance of the CPU and GPU sub-systems from 1996 through 2006. As a reverse salient, the PC game sub-system trails the CPU sub-system by up to 2300 MHz, with a gradually decreasing performance disparity in recent years. In contrast, the PC game sub-system trails the GPU sub-system with an ever-increasing performance gap throughout the timeframe of analysis. In addition, we discuss the research and managerial implications of our findings.
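The performance-gap measure itself is simple arithmetic: the gap at time t is the frontier sub-system's performance minus the reverse salient's. A toy sketch follows, with made-up numbers; the paper's actual data are CPU/GPU clock speeds versus PC game requirements over 1996-2006.

```python
# Toy illustration of the performance-gap measure of reverse salience.
# All figures below are invented for illustration only.
cpu_mhz = {1996: 200, 2001: 1500, 2006: 3000}   # hypothetical frontier values
game_mhz = {1996: 120, 2001: 500, 2006: 1800}   # hypothetical requirements

def performance_gap(frontier: dict, salient: dict) -> dict:
    """Gap(t) = frontier(t) - salient(t); positive means the salient trails."""
    return {t: frontier[t] - salient[t] for t in sorted(frontier)}

print(performance_gap(cpu_mhz, game_mhz))  # {1996: 80, 2001: 1000, 2006: 1200}
```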
Abstract:
We extended genetic linkage analysis, widely used in quantitative genetics, to 3D images to analyze single-gene effects on brain fiber architecture. We collected 4 Tesla diffusion tensor images (DTI) and genotype data from 258 healthy adult twins and their non-twin siblings. After high-dimensional fluid registration, at each voxel we estimated the genetic linkage between the single nucleotide polymorphism (SNP) Val66Met (dbSNP number rs6265) of the BDNF gene (brain-derived neurotrophic factor) and fractional anisotropy (FA) derived from each subject's DTI scan, by fitting structural equation models (SEM) from quantitative genetics. We also examined how image filtering affects the effect sizes for genetic linkage, by examining how the overall significance of voxelwise effects varied with the full width at half maximum (FWHM) of the Gaussian smoothing applied to the FA images. Raw FA maps with no smoothing yielded the greatest sensitivity to detect gene effects when corrected for multiple comparisons using the false discovery rate (FDR) procedure. The BDNF polymorphism significantly contributed to the variation in FA in the posterior cingulate gyrus, where it accounted for around 90-95% of the total variance in FA. Our study generated the first maps to visualize the effect of the BDNF gene on brain fiber integrity, suggesting that common genetic variants may strongly determine white matter integrity.
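The FDR correction referred to here is the standard Benjamini-Hochberg procedure applied to the voxelwise p-values. A minimal sketch (not the authors' code; the inputs are illustrative):

```python
# Benjamini-Hochberg FDR over a flat array of voxelwise p-values.
import numpy as np

def fdr_threshold(pvals: np.ndarray, q: float = 0.05) -> float:
    """Largest p satisfying the BH criterion p_(i) <= (i/m)*q, else 0.0
    (meaning no voxel is declared significant)."""
    p = np.sort(pvals.ravel())
    m = p.size
    below = p <= (np.arange(1, m + 1) / m) * q
    return p[below][-1] if below.any() else 0.0

rng = np.random.default_rng(0)
pvals = rng.uniform(size=10_000)         # null voxels: expect no discoveries
pvals[:50] = rng.uniform(0, 1e-4, 50)    # a few strong voxelwise effects
threshold = fdr_threshold(pvals, q=0.05)
print(f"voxels passing FDR: {(pvals <= threshold).sum()}")
```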
Abstract:
Imaging genetics aims to discover how variants in the human genome influence brain measures derived from images. Genome-wide association scans (GWAS) can screen the genome for common differences in our DNA that relate to brain measures. In small samples, GWAS has low power, as individual gene effects are weak and one must also correct for multiple comparisons across the genome and the image. Here we extend recent work on genetic clustering of images to analyze surface-based models of anatomy using GWAS. We performed spherical harmonic analysis of hippocampal surfaces, automatically extracted from brain MRI scans of 1254 subjects. We clustered hippocampal surface regions with common genetic influences by examining the genetic correlations (r_g) between the normalized deformation values at all pairs of surface points. Using genetic correlations to cluster surface measures, we were able to boost effect sizes for genetic associations, compared to clustering with traditional phenotypic correlations using Pearson's r.
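The clustering step can be sketched generically: convert the pairwise correlation matrix into a distance matrix (1 - r) and cut a hierarchical tree into regions. The paper clusters on genetic correlations r_g estimated from twin data; in the stand-in below a plain Pearson correlation matrix and all sizes are illustrative, so this shows only the clustering mechanics, not the genetic modeling.

```python
# Correlation-based clustering of surface points (toy stand-in for r_g).
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform

rng = np.random.default_rng(1)
deform = rng.standard_normal((100, 40))  # subjects x surface points (toy sizes)

r = np.corrcoef(deform.T)                # stand-in for the genetic r_g matrix
dist = 1.0 - r                           # high correlation -> small distance
np.fill_diagonal(dist, 0.0)
dist = (dist + dist.T) / 2               # enforce exact symmetry for squareform

tree = linkage(squareform(dist, checks=False), method="average")
labels = fcluster(tree, t=5, criterion="maxclust")  # cut into five clusters
print(np.bincount(labels)[1:])           # points per cluster
```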