899 results for Computer Networks and Communications
Abstract:
Second-order phase-locked loops (PLLs) are devices that can provide synchronization between the nodes of a network even under severe quality restrictions on signal propagation. Consequently, they are widely used in telecommunications and control. Conventional master-slave (M-S) clock-distribution systems are being replaced by mutually connected (MC) ones because of their potential in new types of applications such as wireless sensor networks, distributed computation and communication systems. Here, by analytical reasoning, a nonlinear algebraic system of equations is proposed to establish the existence conditions for the synchronous state in an MC PLL network. Numerical experiments confirm the analytical results and provide insight into how the network parameters affect the reachability of the synchronous state. The phase-difference oscillation amplitudes are related to the node parameters, helping to design PLL neural networks. Furthermore, estimating the acquisition time as a function of the node parameters allows the performance evaluation of time-distribution systems and neural networks based on phase-locking techniques. (c) 2008 Elsevier GmbH. All rights reserved.
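As an illustrative sketch only (not the authors' system of equations), mutual synchronization of two coupled second-order nodes can be mimicked by a damped coupled-phase model; the coupling gain `k`, damping `mu`, and initial phase offset below are hypothetical values:

```python
import math

def simulate(mu=1.0, k=4.0, dt=1e-3, steps=20000):
    # Two mutually connected nodes; each node's phase acceleration is driven
    # by the sine of the phase difference (damped pendulum-like dynamics).
    th = [0.0, 1.5]   # initial phases: a 1.5 rad offset to be pulled in
    w = [0.0, 0.0]    # phase rates
    for _ in range(steps):
        a0 = k * math.sin(th[1] - th[0]) - mu * w[0]
        a1 = k * math.sin(th[0] - th[1]) - mu * w[1]
        w[0] += a0 * dt
        w[1] += a1 * dt
        th[0] += w[0] * dt   # semi-implicit Euler: use the updated rates
        th[1] += w[1] * dt
    return th[1] - th[0]     # residual phase difference

diff = simulate()  # close to zero once the synchronous state is reached
```

With this damping the phase difference decays roughly as exp(-mu*t/2), so after 20 s of simulated time the nodes are effectively phase-locked.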
Abstract:
The design, construction, and characterization of a portable opto-coupled potentiostat are presented. The potentiostat is battery-powered and managed by a microcontroller that implements cyclic voltammetry (CV) using suitable sensor electrodes. Its opto-coupling permits a wide range of current measurements, from mA down to nA. Two software interfaces were developed to perform the CV measurement: a virtual instrument for a personal computer (PC) and a C-based interface for a personal digital assistant (PDA). The potentiostat has been evaluated by detection of potassium ferrocyanide in KCl medium, with both macro- and microelectrodes. There was good agreement between the instrumental results and those from commercial equipment.
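For illustration only, the triangular potential sweep at the heart of cyclic voltammetry can be sketched as below; the start potential, vertex potential, and scan rate are made-up values, not parameters of the instrument described:

```python
def cv_waveform(e_start, e_vertex, scan_rate, dt):
    """Triangular potential programme for one CV cycle (assumes e_vertex > e_start)."""
    step = scan_rate * dt          # potential increment per sample, in volts
    pts, e = [], e_start
    while e < e_vertex:            # forward (anodic) sweep
        pts.append(e)
        e += step
    e = e_vertex                   # clamp at the vertex potential
    while e > e_start:             # reverse (cathodic) sweep
        pts.append(e)
        e -= step
    return pts

cv = cv_waveform(e_start=-0.2, e_vertex=0.6, scan_rate=0.05, dt=0.01)
```

In a real instrument each point of this programme would be applied via the DAC and the resulting cell current sampled, yielding the voltammogram.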
Abstract:
A Geographic Information System (GIS) was used to model datasets of Leyte Island, the Philippines, to identify land which was suitable for a forest extension program on the island. The datasets were modelled to provide maps of the distance of land from cities and towns, land with elevation and slope suitable for smallholder forestry, and land of various soil types. An expert group was used to assign numeric site suitabilities to the soil types, and maps of site suitability were used to assist the selection of municipalities for the provision of extension assistance to smallholders. Modelling of the datasets was facilitated by recent developments of the ArcGIS® suite of computer programs, and derivation of elevation and slope was assisted by the availability of digital elevation models (DEM) produced by the Shuttle Radar Topography Mission (SRTM). The usefulness of GIS software as a decision support tool for small-scale forestry extension programs is discussed.
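The kind of suitability modelling described is commonly implemented as a weighted overlay of raster layers; the sketch below uses hypothetical layer names and weights, not the expert group's actual scores:

```python
def weighted_overlay(layers, weights):
    # layers: dict name -> 2-D grid of suitability scores (all the same shape)
    # weights: dict name -> relative importance (assumed to sum to 1)
    names = list(layers)
    rows, cols = len(layers[names[0]]), len(layers[names[0]][0])
    out = [[0.0] * cols for _ in range(rows)]
    for name in names:
        w, grid = weights[name], layers[name]
        for r in range(rows):
            for c in range(cols):
                out[r][c] += w * grid[r][c]   # weighted sum per cell
    return out

# Toy 1x2 rasters: soil and slope suitability scores on a 1-4 scale
result = weighted_overlay(
    {"soil": [[4, 2]], "slope": [[3, 1]]},
    {"soil": 0.6, "slope": 0.4},
)
```

Cells with the highest combined score would then be candidates for extension assistance.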
Abstract:
Every day trillions of dollars circulate the globe in a digital data space and new forms of property and ownership emerge. Massive corporate entities with a global reach are formed and disappear with breathtaking speed, making and breaking personal fortunes of a size that defies imagination. Fictitious commodities abound. The genomes of entire nations have become corporately owned. Relationships have become the overt basis of economic wealth and political power. Hypercapitalism explores the problems of understanding this emergent form of global political economic organization by focusing on the internal relations between language, new media networks, and social perceptions of value. Taking an historical approach informed by Marx, Phil Graham draws upon writings in political economy, media studies, sociolinguistics, anthropology, and critical social science to understand the development, roots, and trajectory of the global system in which every possible aspect of human existence, including imagined futures, has become a commodity form.
Abstract:
The second edition of An Introduction to Efficiency and Productivity Analysis is designed to be a general introduction for those who wish to study efficiency and productivity analysis. The book provides an accessible, well-written introduction to the four principal methods involved: econometric estimation of average response models; index numbers; data envelopment analysis (DEA); and stochastic frontier analysis (SFA). For each method, a detailed introduction to the basic concepts is presented, numerical examples are provided, and some of the more important extensions to the basic methods are discussed. Of special interest is the systematic use of detailed empirical applications using real-world data throughout the book. In recent years, there have been a number of excellent advanced-level books published on performance measurement. This book, however, is the first systematic survey of performance measurement with the express purpose of introducing the field to a wide audience of students, researchers, and practitioners. Indeed, the 2nd Edition maintains its uniqueness: (1) It is a well-written introduction to the field. (2) It outlines, discusses and compares the four principal methods for efficiency and productivity analysis in a well-motivated presentation. (3) It provides detailed advice on computer programs that can be used to implement these performance measurement methods. The book contains computer instructions and output listings for the SHAZAM, LIMDEP, TFPIP, DEAP and FRONTIER computer programs. More extensive listings of data and computer instruction files are available on the book's website: (www.uq.edu.au/economics/cepa/crob2005).
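Of the four methods, the index-number approach is the simplest to illustrate. A minimal sketch of a Törnqvist quantity index (the standard share-weighted geometric index; the quantities and prices below are made up for illustration):

```python
import math

def tornqvist(q0, q1, p0, p1):
    """Tornqvist quantity index between period 0 and period 1.

    q0, q1: quantities per commodity; p0, p1: matching prices.
    Growth in each commodity is weighted by its average value share.
    """
    r0 = sum(p * q for p, q in zip(p0, q0))   # total value, period 0
    r1 = sum(p * q for p, q in zip(p1, q1))   # total value, period 1
    s0 = [p * q / r0 for p, q in zip(p0, q0)]  # value shares, period 0
    s1 = [p * q / r1 for p, q in zip(p1, q1)]  # value shares, period 1
    ln_idx = sum(0.5 * (a + b) * math.log(y / x)
                 for a, b, x, y in zip(s0, s1, q0, q1))
    return math.exp(ln_idx)

idx = tornqvist(q0=[10], q1=[12], p0=[5], p1=[6])  # single output grows 20%
```

A total factor productivity index is then typically the ratio of an output index to an input index of this form.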
Abstract:
Motivation: Prediction methods for identifying binding peptides could minimize the number of peptides required to be synthesized and assayed, and thereby facilitate the identification of potential T-cell epitopes. We developed a bioinformatic method for the prediction of peptide binding to MHC class II molecules. Results: Experimental binding data and expert knowledge of anchor positions and binding motifs were combined with an evolutionary algorithm (EA) and an artificial neural network (ANN): binding data extraction --> peptide alignment --> ANN training and classification. This method, termed PERUN, was implemented for the prediction of peptides that bind to HLA-DR4(B1*0401). The respective positive predictive values of PERUN predictions of high-, moderate-, low- and zero-affinity binders were assessed as 0.8, 0.7, 0.5 and 0.8 by cross-validation, and 1.0, 0.8, 0.3 and 0.7 by experimental binding. This illustrates the synergy between experimentation and computer modeling, and its application to the identification of potential immunotherapeutic peptides.
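The positive predictive values quoted above follow the standard definition TP / (TP + FP); a minimal sketch with made-up predictions and labels:

```python
def ppv(predictions, labels):
    """Positive predictive value: of the peptides predicted to bind,
    the fraction that actually bind. predictions/labels are 0/1 sequences."""
    tp = sum(1 for p, y in zip(predictions, labels) if p and y)
    fp = sum(1 for p, y in zip(predictions, labels) if p and not y)
    return tp / (tp + fp) if (tp + fp) else float("nan")

# Toy example: 3 peptides predicted as binders, 2 of which truly bind
value = ppv(predictions=[1, 1, 1, 0], labels=[1, 1, 0, 0])
```

In a cross-validation setting this quantity would be averaged over held-out folds for each affinity class.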
Abstract:
Optical constants of AlSb, GaSb, and InSb are modeled in the 1-6 eV spectral range. We employ an extension of Adachi's model of the optical constants of semiconductors. The model takes into account transitions at the E-0, E-0 + Delta(0), E-1, and E-1 + Delta(1) critical points, as well as higher-lying transitions which are modeled with three damped harmonic oscillators. We do not consider the contribution of indirect transitions, since it represents a second-order perturbation and its strength should be low. Also, we do not take into account excitonic effects at the E-1 and E-1 + Delta(1) critical points, since we model the room-temperature data. In spite of fewer contributions to the dielectric function compared to previous calculations involving Adachi's model, our calculations show significantly improved agreement with the experimental data. This is due to the two main distinguishing features of the calculations presented here: use of adjustable line broadening instead of the conventional Lorentzian one, and employment of a global optimization routine for model parameter determination.
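The damped-harmonic-oscillator contribution to the dielectric function used for higher-lying transitions has the generic form eps(E) = sum_j f_j / (E_j^2 - E^2 - i*E*Gamma_j); a sketch with made-up oscillator parameters (not the fitted values for AlSb, GaSb, or InSb):

```python
def dho_epsilon(E, oscillators):
    """Dielectric-function contribution of damped harmonic oscillators.

    E: photon energy in eV.
    oscillators: list of (f, Ej, Gj) = (strength, resonance energy eV,
    broadening eV). Returns a complex permittivity contribution.
    """
    eps = 0 + 0j
    for f, Ej, Gj in oscillators:
        eps += f / (Ej**2 - E**2 - 1j * E * Gj)
    return eps

# At resonance (E == Ej) the contribution is purely imaginary (absorptive)
eps_res = dho_epsilon(4.0, [(10.0, 4.0, 0.5)])
```

The paper's refinement replaces the constant (Lorentzian) broadening Gamma_j with an adjustable, energy-dependent broadening, which this sketch does not include.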
Abstract:
This paper uses a unique new data set on manufacturing firms in Brazil and India to estimate production functions, augmented by information and communications technology (ICT). We find a strong positive association between ICT capital and productivity in both countries that is robust to several different specification tests. The paper also breaks new ground when using the Indian data to investigate the effect of the institutional and policy environment on ICT capital investment and productivity. We find that poorer infrastructure quality and labor market policy are associated with lower levels of ICT adoption, while poorer infrastructure is also associated with lower returns to investment.
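An ICT-augmented production function is commonly written in Cobb-Douglas form, Y = A * K^alpha * L^beta * ICT^gamma, where gamma is the output elasticity of ICT capital; the parameter values below are purely illustrative, not the paper's estimates:

```python
def cobb_douglas(A, K, L, ICT, alpha=0.3, beta=0.6, gamma=0.1):
    """Output from ordinary capital K, labor L, and ICT capital,
    with gamma the output elasticity of ICT capital."""
    return A * K**alpha * L**beta * ICT**gamma

# Elasticity interpretation: a 1% increase in ICT capital raises output
# by approximately gamma percent, holding the other inputs fixed.
y_base = cobb_douglas(A=1.0, K=100.0, L=100.0, ICT=10.0)
y_more_ict = cobb_douglas(A=1.0, K=100.0, L=100.0, ICT=10.0 * 1.01)
```

Estimation proceeds by taking logs, which turns this into a linear regression of log output on log inputs; the institutional variables discussed in the paper would then shift ICT investment and the estimated returns.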
Abstract:
This study examined the impact of computer and assistive device use on the employment status and vocational modes of people with physical disabilities in Australia. A survey was distributed to people over 15 years of age with physical disabilities living in the Brisbane area. Responses were received from 82 people, including those with spinal cord injuries, cerebral palsy and muscular dystrophy. Of the respondents, 46 were employed, 22 were unemployed, and 12 were either students or undertaking voluntary work. Three-quarters of respondents used a computer in their occupations, while 15 used assistive devices. Using logistic regression analysis, it was found that gender, education, level of computer skill and computer training were significant predictors of employment outcomes. Neither the age of respondent nor use of assistive software were significant predictors. From the information obtained in this study, guidelines for a training programme designed to maximize the employability of people with physical disabilities were developed.
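Logistic regression models the probability of employment as p = 1 / (1 + e^(-z)), where z is a linear combination of the predictors; the coefficients and feature encoding below are hypothetical illustrations, not the study's estimates:

```python
import math

def employment_prob(features, coefs, intercept):
    """Logistic-regression prediction: features and coefs are matching
    sequences; returns a probability in (0, 1)."""
    z = intercept + sum(b * x for b, x in zip(coefs, features))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical predictors: gender (0/1), years of education,
# computer-skill level (1-5), computer training received (0/1)
coefs, intercept = [0.8, 0.15, 0.6, 0.9], -4.0
p_hi = employment_prob([1, 16, 4, 1], coefs, intercept)  # high skill + training
p_lo = employment_prob([1, 16, 1, 0], coefs, intercept)  # low skill, no training
```

The fitted coefficients indicate how each predictor shifts the log-odds of employment, which is how the study identifies its significant predictors.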
Abstract:
Over the past years, component-based software engineering has become an established paradigm in the area of complex software-intensive systems. However, many techniques for analyzing these systems for critical properties currently do not make use of the component orientation. In particular, safety analysis of component-based systems is an open field of research. In this chapter we investigate the problems that arise and define a set of requirements that apply when adapting the analysis of safety properties to a component-based software engineering process. Based on these requirements, several important component-oriented safety evaluation approaches are examined and compared.
Specification, refinement and verification of concurrent systems: an integration of Object-Z and CSP