842 results for Computer Systems and Web Technologies


Abstract:

The Alliance for Coastal Technologies (ACT) Partner University of Michigan convened a workshop on the Applications of Drifting Buoy Technologies for Coastal Watershed and Ecosystem Modeling in Ann Arbor, Michigan, on June 5 to 7, 2005. The objectives of the workshop were to: (1) educate potential users (managers and scientists) about the current capabilities and uses of drifting buoy technologies; (2) provide an opportunity for users (managers and scientists) to experience first-hand the deployment and retrieval of various drifting buoys, as well as the capabilities of the buoys' technologies; (3) engage manufacturers with scientists and managers in discussions on drifting buoys' capabilities and their requirements, to promote further applications of these systems; (4) promote a dialogue about realistic advantages and limitations of current drifting buoy technologies; and (5) develop a set of key recommendations for advancing both the capabilities and uses of drifting buoy technologies for coastal watershed and ecosystem modeling. To achieve these goals, representatives from research, academia, industry, and resource management were invited to participate in the workshop. Attendees obtained hands-on experience as they participated in the deployment and retrieval of various drifting buoy systems on Big Portage Lake, a 644-acre lake northwest of Ann Arbor. Working groups then convened for discussions on current commercial uses and environmental monitoring approaches, including user requirements for drifting buoys, the current status of drifting buoy systems and enabling technologies, and the challenges and strategies for bringing new drifting buoys "on-line". The following general recommendations were made: (1) organize a testing program of drifting buoys to market their capabilities to resource managers and users; (2) develop a fact sheet to highlight the utility of drifting buoys; and (3) facilitate technology transfer of advancements in drifting buoys that may be occurring through military funding and development, in order to enhance their technical capability for environmental applications. (pdf contains 18 pages)

Abstract:

With the size of transistors approaching the sub-nanometer scale and Si-based photonics pinned at the micrometer scale due to the diffraction limit of light, we are unable to easily integrate the high transfer speeds of this comparatively bulky technology with the increasingly smaller architecture of state-of-the-art processors. However, we find that we can bridge the gap between these two technologies by directly coupling electrons to photons through the use of dispersive metals in optics. Doing so allows us to access the surface electromagnetic wave excitations that arise at a metal/dielectric interface, a feature which both confines and enhances light in subwavelength dimensions - two promising characteristics for the development of integrated chip technology. This platform is known as plasmonics, and it allows us to design a broad range of complex metal/dielectric systems, all having different nanophotonic responses, but all originating from our ability to engineer the system's surface plasmon resonances and interactions. In this thesis, we demonstrate how plasmonics can be used to develop coupled metal-dielectric systems that function as tunable plasmonic hole array color filters for CMOS image sensing, visible metamaterials composed of coupled negative-index plasmonic coaxial waveguides, and programmable plasmonic waveguide network systems to serve as color routers and logic devices at telecommunication wavelengths.
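
As a rough illustration of the subwavelength confinement described above, the sketch below evaluates the textbook surface plasmon polariton dispersion relation, k_spp = k0 * sqrt(eps_m * eps_d / (eps_m + eps_d)), for a lossless Drude-model metal; the plasma frequency and dielectric constant are illustrative assumptions, not values taken from the thesis.

```python
import math

# Illustrative, assumed parameters (not values from the thesis):
# a silver-like plasma frequency and a glass-like dielectric.
OMEGA_P = 1.37e16      # metal plasma frequency, rad/s
EPS_D = 2.25           # permittivity of the dielectric (e.g. glass)
C = 3.0e8              # speed of light, m/s

def eps_metal(omega):
    """Lossless Drude permittivity of the metal."""
    return 1.0 - (OMEGA_P / omega) ** 2

def spp_wavelength_nm(free_space_wavelength_nm):
    """Surface plasmon polariton wavelength at a metal/dielectric interface."""
    wl = free_space_wavelength_nm * 1e-9
    omega = 2.0 * math.pi * C / wl
    eps_m = eps_metal(omega)
    k0 = 2.0 * math.pi / wl
    # A bound SPP mode exists only where eps_m < -EPS_D; for the
    # wavelengths below this condition holds.
    k_spp = k0 * math.sqrt(eps_m * EPS_D / (eps_m + EPS_D))
    return 2.0 * math.pi / k_spp * 1e9

for wl in (633, 800, 1550):
    print(f"free-space {wl} nm -> SPP wavelength ~ {spp_wavelength_nm(wl):.0f} nm")
```

The shortened SPP wavelength relative to free space is the confinement the abstract refers to.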

Abstract:

A review article looking at the types of information requirements commonly shared by scientists and their use of traditional information services. Areas covered include the primary requirements of IFE (Institute of Freshwater Ecology) staff, pure versus applied research, informal and personal sources of information, and traditional library and information services. It goes on to describe how research into information systems and technology may improve the wider accessibility and use of information by the scientific community. Technologies covered include online databases, telecommunications, gateways, expert systems, optical technology and applications of CD-ROM.

Abstract:

Sociomateriality has been attracting growing attention in the Organization Studies and Information Systems literatures since 2007, with more than 140 journal articles now referring to the concept. Over 80 percent of these articles have been published since January 2011 and almost all cite the work of Orlikowski (2007, 2010; Orlikowski and Scott 2008) as the source of the concept. Only a few, however, address all of the notions that Orlikowski suggests are entailed in sociomateriality, namely materiality, inseparability, relationality, performativity, and practices, with many employing the concept quite selectively. The contribution of sociomateriality to these literatures is, therefore, still unclear. Drawing on evidence from an ongoing study of the adoption of a computer-based clinical information system in a hospital critical care unit, this paper explores whether the notions, individually and collectively, offer a distinctive and coherent account of the relationship between the social and the material that may be useful in Information Systems research. It is argued that if sociomateriality is to be more than simply a label for research employing a number of loosely related existing theoretical approaches, then studies employing the concept need to pay greater attention to the notions entailed in it and to differences in their interpretation.

Abstract:

Over the last few years a number of sensing platforms have been investigated for their use in drug development, microanalysis and medical diagnosis. Lab-on-a-chip (LOC) devices integrate more than one laboratory function on a single chip of very small size, and typically consist of two main components: microfluidic handling systems and sensors. The physical mechanisms generally used for microfluidics and for sensing are different, making the integration of these components difficult and costly. In this work we present a lab-on-a-chip system based on surface acoustic waves (for fluid manipulation) and film bulk acoustic resonators (for sensing). Coupling surface acoustic waves into liquids induces acoustic streaming and motion of micro-droplets, whilst it is well known that bulk acoustic waves can be used to fabricate microgravimetric sensors. Both technologies offer exceptional sensitivity and can be fabricated from piezoelectric thin films deposited on Si substrates, reducing the fabrication time and cost of the LOC devices. © 2013 SPIE.
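
To make the microgravimetric (mass-loading) sensing principle concrete, the sketch below applies the classical Sauerbrey relation for a quartz crystal microbalance, delta_f = -2 f0^2 delta_m / (A sqrt(rho_q mu_q)), as a stand-in for the general mass-loading behaviour of bulk acoustic resonators; the resonator frequency, active area and added mass are assumed values for illustration and do not describe the FBAR devices in the paper.

```python
import math

# Textbook quartz constants (QCM case, used here as a stand-in for the
# general mass-loading principle behind microgravimetric sensing).
RHO_Q = 2.648e3        # density of quartz, kg/m^3
MU_Q = 2.947e10        # shear modulus of AT-cut quartz, Pa

def sauerbrey_shift(f0_hz, delta_mass_kg, area_m2):
    """Frequency shift (Hz) caused by a thin, rigid added-mass layer."""
    return -2.0 * f0_hz**2 * delta_mass_kg / (area_m2 * math.sqrt(RHO_Q * MU_Q))

# Illustrative (assumed) numbers: a 10 MHz resonator, 0.2 cm^2 active
# area, and 10 ng of adsorbed material.
f0 = 10e6
area = 0.2e-4            # m^2
delta_m = 10e-9 * 1e-3   # 10 ng in kg

print(f"Estimated frequency shift: {sauerbrey_shift(f0, delta_m, area):.1f} Hz")
```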

Abstract:

The State Key Laboratory of Computer Science (SKLCS) is committed to basic research in computer science and software engineering. The research topics of the laboratory include: concurrency theory, theory and algorithms for real-time systems, formal specifications based on context-free grammars, semantics of programming languages, model checking, automated reasoning, logic programming, software testing, software process improvement, middleware technology, parallel algorithms and parallel software, computer graphics and human-computer interaction. This paper describes these topics in some detail and summarizes some results obtained in recent years.

Abstract:

This paper examines how and why web server performance changes as the workload at the server varies. We measure the performance of a PC acting as a standalone web server, running Apache on top of Linux. We use two important tools to understand what aspects of software architecture and implementation determine performance at the server. The first is a tool that we developed, called WebMonitor, which measures activity and resource consumption, both in the operating system and in the web server. The second is the kernel profiling facility distributed as part of Linux. We vary the workload at the server along two important dimensions: the number of clients concurrently accessing the server, and the size of the documents stored on the server. Our results quantify and show how more clients and larger files stress the web server and operating system in different and surprising ways. Our results also show the importance of fixed costs (i.e., opening and closing TCP connections, and updating the server log) in determining web server performance.
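
The abstract does not detail how the two workload dimensions were driven; a minimal load-generation harness along the same axes (number of concurrent clients and document size) might look like the sketch below, where the server URL, document names and client counts are assumptions for illustration rather than the paper's actual setup, which used WebMonitor and kernel profiling.

```python
import time
import concurrent.futures
import urllib.request

# Hypothetical test server and documents -- assumptions for
# illustration, not the configuration used in the paper.
BASE_URL = "http://localhost:8080"
DOCUMENTS = ["1kb.html", "64kb.html", "1mb.html"]
CLIENT_COUNTS = [1, 8, 32, 64]
REQUESTS_PER_CLIENT = 50

def client(doc):
    """One client repeatedly fetching a single document."""
    url = f"{BASE_URL}/{doc}"
    for _ in range(REQUESTS_PER_CLIENT):
        with urllib.request.urlopen(url) as resp:
            resp.read()

def run(doc, n_clients):
    """Run n_clients concurrent clients against one document size."""
    start = time.perf_counter()
    with concurrent.futures.ThreadPoolExecutor(max_workers=n_clients) as pool:
        list(pool.map(client, [doc] * n_clients))
    elapsed = time.perf_counter() - start
    return n_clients * REQUESTS_PER_CLIENT / elapsed  # requests per second

if __name__ == "__main__":
    for doc in DOCUMENTS:
        for n in CLIENT_COUNTS:
            print(f"{doc:>10} x {n:3d} clients: {run(doc, n):8.1f} req/s")
```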

Abstract:

Advanced sensory systems address a number of major obstacles to the provision of cost-effective and proactive rehabilitation. Many of these systems employ technologies such as high-speed video or motion capture to generate quantitative measurements. However, these solutions are accompanied by some major limitations, including extensive set-up and calibration, restriction to indoor use, high cost and time-consuming data analysis. Additionally, many do not quantify improvement in a rigorous manner, for example gait analysis over 5 minutes as opposed to 24-hour ambulatory monitoring. This work addresses these limitations using low-cost, wearable wireless inertial measurement as a mobile and minimal-infrastructure alternative. In cooperation with healthcare professionals, the goal is to design and implement a reconfigurable and intelligent movement capture system. A key component of this work is an extensive benchmark comparison with the 'gold standard' VICON motion capture system.
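
As a rough sketch of the kind of on-board processing such wearable inertial units typically require, the snippet below fuses gyroscope and accelerometer readings into a tilt-angle estimate with a simple complementary filter; the sampling rate, filter coefficient and synthetic sensor stream are assumptions for illustration and are not taken from the system described above.

```python
import math

ALPHA = 0.98   # complementary-filter weight (assumed)
DT = 0.01      # sampling period in seconds, i.e. 100 Hz (assumed)

def tilt_from_accel(ax, ay, az):
    """Tilt angle (rad) about one axis, estimated from the gravity vector."""
    return math.atan2(ay, math.sqrt(ax * ax + az * az))

def complementary_filter(samples):
    """Fuse gyro rate (rad/s) and accelerometer (g) into a tilt estimate.

    samples: iterable of (gyro_rate, ax, ay, az) tuples.
    Yields one angle estimate (rad) per sample.
    """
    angle = 0.0
    for gyro_rate, ax, ay, az in samples:
        gyro_angle = angle + gyro_rate * DT          # integrate gyro rate
        accel_angle = tilt_from_accel(ax, ay, az)    # absolute but noisy
        angle = ALPHA * gyro_angle + (1 - ALPHA) * accel_angle
        yield angle

# Tiny synthetic example: device held still, slight accelerometer noise.
fake_samples = [(0.0, 0.0, 0.02, 1.0)] * 5
print([round(a, 4) for a in complementary_filter(fake_samples)])
```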

Abstract:

Buried heat sources can be investigated by examining thermal infrared images and comparing these with the results of theoretical models which predict the thermal anomaly a given heat source may generate. Key factors influencing surface temperature include the geometry and temperature of the heat source, the surface meteorological environment, and the thermal conductivity and anisotropy of the rock. In general, a geothermal heat flux of greater than 2% of solar insolation is required to produce a detectable thermal anomaly in a thermal infrared image. A heat source of, for example, 2-300 K greater than the average surface temperature must be at a depth shallower than 50 m for the anomaly to be detectable in a thermal infrared image, for typical terrestrial conditions. Atmospheric factors are of critical importance. While the mean atmospheric temperature has little significance, convection is a dominant factor and can act to swamp the thermal signature entirely. Given a steady-state heat source that produces a detectable thermal anomaly, it is possible to loosely constrain the physical properties of the heat source and surrounding rock, using the surface thermal anomaly as a basis. The success of this technique is highly dependent on the degree to which the physical properties of the host rock are known. Important parameters include the surface thermal properties and the thermal conductivity of the rock. Modelling of transient thermal situations was carried out to assess the effect of time-dependent thermal fluxes. One-dimensional finite element models can be readily and accurately applied to the investigation of diurnal heat flow, as with thermal inertia models. Diurnal thermal models of environments on Earth, the Moon and Mars were constructed using finite elements and found to be consistent with published measurements. The heat flow from an injection of hot lava into a near-surface lava tube was also considered. While this approach was useful for study and for long-term monitoring in inhospitable areas, it was found to have little hazard-warning utility, as the time taken for the thermal energy to propagate to the surface in dry rock (several months) is very long. The resolution of the thermal infrared imaging system is an important factor. Presently available satellite-based systems such as Landsat (resolution of 120 m) are inadequate for detailed study of geothermal anomalies. Airborne systems, such as TIMS (variable resolution of 3-6 m), are much more useful for discriminating small buried heat sources. Planned improvements in the resolution of satellite-based systems will broaden the potential for application of the techniques developed in this thesis. It is important to note, however, that adequate spatial resolution is a necessary but not sufficient condition for the successful application of these techniques.
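
As an illustration of the one-dimensional diurnal modelling mentioned above, the sketch below solves the 1-D heat conduction equation with an explicit finite-difference scheme (a simpler stand-in for the finite element models used in the thesis) under a sinusoidal diurnal surface temperature; the rock properties, forcing amplitude and grid are assumed, illustrative values.

```python
import math

# Assumed, illustrative rock properties (not values from the thesis).
KAPPA = 1.0e-6                # thermal diffusivity, m^2/s
DZ = 0.02                     # grid spacing, m
DEPTH = 1.0                   # modelled depth, m
DT = 0.45 * DZ**2 / KAPPA     # time step within the explicit stability limit
DAY = 86400.0

T_MEAN, T_AMP = 290.0, 10.0   # mean surface temperature and diurnal swing, K

n = int(DEPTH / DZ) + 1
temp = [T_MEAN] * n           # initial condition: uniform at the mean

t = 0.0
while t < 5 * DAY:            # run a few days to approach a cyclic state
    # Diurnal surface forcing applied as the top boundary condition.
    temp[0] = T_MEAN + T_AMP * math.sin(2 * math.pi * t / DAY)
    new = temp[:]
    for i in range(1, n - 1):
        new[i] = temp[i] + KAPPA * DT / DZ**2 * (
            temp[i + 1] - 2 * temp[i] + temp[i - 1])
    new[-1] = new[-2]         # zero-flux bottom boundary
    temp = new
    t += DT

for depth_cm in (0, 10, 20, 50):
    print(f"{depth_cm:3d} cm: {temp[round(depth_cm / 100 / DZ)]:.2f} K")
```

The damping and lag of the diurnal wave with depth is the same behaviour exploited by thermal inertia models.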

Abstract:

The demands of the process of engineering design, particularly for structural integrity, have exploited computational modelling techniques and software tools for decades. Frequently, the shape of structural components or assemblies is determined to optimise the flow distribution or heat transfer characteristics, and to ensure that the structural performance in service is adequate. From the perspective of computational modelling these activities are typically separated into:

• fluid flow and the associated heat transfer analysis (possibly with chemical reactions), based upon Computational Fluid Dynamics (CFD) technology;
• structural analysis, again possibly with heat transfer, based upon finite element analysis (FEA) techniques.

Abstract:

The aim of this work is to improve retrieval and navigation services on bibliographic data held in digital libraries. This paper presents the design and implementation of OntoBib, an ontology-based bibliographic database system that adopts ontology-driven search in its retrieval. The presented work exemplifies how a digital library of bibliographic data can be managed using Semantic Web technologies and how utilizing domain-specific knowledge improves both search efficiency and the navigation of web information and document retrieval.
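
The abstract does not show what ontology-driven retrieval looks like inside OntoBib; as a minimal sketch of the general idea, the snippet below queries a small bibliographic RDF graph with SPARQL via rdflib, exploiting a subject hierarchy so that a broad query also retrieves more specific topics. The namespace, properties and data are invented for illustration and are not the ontology used by OntoBib.

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

# Hypothetical bibliographic ontology namespace -- an illustration,
# not the ontology actually used by OntoBib.
BIB = Namespace("http://example.org/bib#")

g = Graph()
paper = URIRef("http://example.org/bib/paper1")
g.add((paper, RDF.type, BIB.JournalArticle))
g.add((paper, BIB.title, Literal("Ontology-driven search in digital libraries")))
g.add((paper, BIB.hasSubject, BIB.SemanticWeb))
# A subject hierarchy lets queries exploit domain knowledge:
g.add((BIB.SemanticWeb, BIB.broaderSubject, BIB.WebTechnologies))

# Find articles filed under "WebTechnologies", directly or via any
# chain of broader subjects (SPARQL 1.1 property path).
query = """
PREFIX bib: <http://example.org/bib#>
SELECT ?title WHERE {
  ?p a bib:JournalArticle ;
     bib:title ?title ;
     bib:hasSubject ?s .
  ?s bib:broaderSubject* bib:WebTechnologies .
}
"""
for row in g.query(query):
    print(row.title)
```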

Abstract:

With the rapid expansion of the internet and the increasing demand on Web servers, many techniques have been developed to overcome the servers' hardware performance limitations. Mirrored Web servers is one of the techniques used, whereby a number of servers carrying the same "mirrored" set of services are deployed; client access requests are then distributed over the set of mirrored servers to even up the load. In this paper we present a generic reference software architecture for load balancing over mirrored web servers. The architecture was designed adopting the latest NaSr architectural style [1] and described using the ADLARS [2] architecture description language. With minimal effort, different tailored product architectures can be generated from the reference architecture to serve different network protocols and server operating systems. An example product system is described and a sample Java implementation is presented.
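
The paper's sample implementation is in Java and is not reproduced here; purely to illustrate the basic idea of spreading client requests over a set of mirrored servers, the sketch below implements a simple round-robin dispatcher in Python. The mirror URLs and the selection policy are assumptions for illustration, not details of the reference architecture.

```python
import itertools
import urllib.request

class RoundRobinBalancer:
    """Distribute client requests over a set of mirrored servers in turn."""

    def __init__(self, mirrors):
        self._mirrors = itertools.cycle(mirrors)

    def fetch(self, path):
        """Forward one client request to the next mirror in the rotation."""
        mirror = next(self._mirrors)
        with urllib.request.urlopen(f"{mirror}{path}") as resp:
            return mirror, resp.status, resp.read()

# Hypothetical mirrored servers -- placeholders, not from the paper.
balancer = RoundRobinBalancer([
    "http://mirror1.example.org",
    "http://mirror2.example.org",
    "http://mirror3.example.org",
])

for _ in range(6):
    mirror, status, _body = balancer.fetch("/index.html")
    print(mirror, status)
```

Round-robin is only one possible policy; a least-connections or weighted scheme could be substituted without changing the dispatcher's interface.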