58 results for Computer Networks and Communications


Relevance:

100.00%

Publisher:

Abstract:

These are the full proceedings of the conference.

Relevance:

100.00%

Publisher:

Abstract:

A Geographic Information System (GIS) was used to model datasets of Leyte Island, the Philippines, to identify land suitable for a forest extension program on the island. The datasets were modelled to provide maps of the distance of land from cities and towns, land of suitable elevation and slope for smallholder forestry, and land of various soil types. An expert group was used to assign numeric site suitabilities to the soil types, and maps of site suitability were used to assist the selection of municipalities for the provision of extension assistance to smallholders. Modelling of the datasets was facilitated by recent developments in the ArcGIS® suite of computer programs, and derivation of elevation and slope was assisted by the availability of digital elevation models (DEM) produced by the Shuttle Radar Topography Mission (SRTM). The usefulness of GIS software as a decision support tool for small-scale forestry extension programs is discussed.
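The overlay modelling described here can be sketched as a simple per-parcel scoring rule. The distance cutoff, elevation and slope limits, weights, and parcel data below are hypothetical illustrations, not the study's expert-assigned values:

```python
# Sketch of a per-parcel suitability overlay of the kind the abstract
# describes. All thresholds and weights here are assumptions for the
# illustration, not values from the Leyte Island study.

def parcel_suitability(distance_km, elevation_m, slope_deg, soil_score):
    """Return a 0-1 suitability score for smallholder forestry."""
    # Closer to a town is better for extension visits (assumed 50 km cutoff).
    access = max(0.0, 1.0 - distance_km / 50.0)
    # Assume land below 1000 m and gentler than 30 degrees is workable.
    terrain = 1.0 if (elevation_m < 1000 and slope_deg < 30) else 0.0
    # soil_score stands in for the expert panel's 0-1 soil-type rating.
    return terrain * (0.5 * access + 0.5 * soil_score)

parcels = [
    ("A", 10, 300, 12, 0.8),   # near town, gentle lowland, good soil
    ("B", 60, 1500, 35, 0.9),  # remote, steep upland: excluded by terrain
]
ranked = sorted(parcels, key=lambda p: -parcel_suitability(*p[1:]))
```

A real GIS workflow would evaluate such a rule per raster cell rather than per named parcel, but the map-algebra idea is the same.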

Relevance:

100.00%

Publisher:

Abstract:

Every day trillions of dollars circulate the globe in a digital data space, and new forms of property and ownership emerge. Massive corporate entities with a global reach are formed and disappear with breathtaking speed, making and breaking personal fortunes the size of which defies imagination. Fictitious commodities abound. The genomes of entire nations have become corporately owned. Relationships have become the overt basis of economic wealth and political power. Hypercapitalism explores the problems of understanding this emergent form of global political economic organization by focusing on the internal relations between language, new media networks, and social perceptions of value. Taking an historical approach informed by Marx, Phil Graham draws upon writings in political economy, media studies, sociolinguistics, anthropology, and critical social science to understand the development, roots, and trajectory of the global system in which every possible aspect of human existence, including imagined futures, has become a commodity form.

Relevance:

100.00%

Publisher:

Abstract:

The second edition of An Introduction to Efficiency and Productivity Analysis is designed as a general introduction for those who wish to study efficiency and productivity analysis. The book provides an accessible, well-written introduction to the four principal methods involved: econometric estimation of average response models; index numbers; data envelopment analysis (DEA); and stochastic frontier analysis (SFA). For each method, a detailed introduction to the basic concepts is presented, numerical examples are provided, and some of the more important extensions to the basic methods are discussed. Of special interest is the systematic use of detailed empirical applications using real-world data throughout the book. In recent years, a number of excellent advanced-level books have been published on performance measurement. This book, however, is the first systematic survey of performance measurement with the express purpose of introducing the field to a wide audience of students, researchers, and practitioners. Indeed, the second edition maintains its uniqueness: (1) it is a well-written introduction to the field; (2) it outlines, discusses, and compares the four principal methods for efficiency and productivity analysis in a well-motivated presentation; and (3) it provides detailed advice on computer programs that can be used to implement these performance measurement methods. The book contains computer instructions and output listings for the SHAZAM, LIMDEP, TFPIP, DEAP and FRONTIER computer programs. More extensive listings of data and computer instruction files are available on the book's website (www.uq.edu.au/economics/cepa/crob2005).
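To give a flavour of one of the four methods, input-oriented CCR data envelopment analysis reduces to a small linear program per decision-making unit. The sketch below is my own minimal example with hypothetical single-input, single-output data, not the book's TFPIP/DEAP software:

```python
# Minimal input-oriented CCR DEA envelopment model, solved with SciPy.
# Data are hypothetical; this is an illustration of the method only.
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y, k):
    """CCR efficiency of decision-making unit (DMU) k.
    X: inputs, shape (m, n); Y: outputs, shape (s, n), for n DMUs."""
    m, n = X.shape
    s = Y.shape[0]
    # Variables: [theta, lambda_1, ..., lambda_n]; minimise theta.
    c = np.zeros(n + 1)
    c[0] = 1.0
    # Inputs:  sum_j lambda_j * x_ij - theta * x_ik <= 0
    A_in = np.hstack([-X[:, [k]], X])
    # Outputs: -sum_j lambda_j * y_rj <= -y_rk
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[:, k]]),
                  bounds=[(0, None)] * (n + 1))
    return res.x[0]

# Three hypothetical firms, one input and one output each.
X = np.array([[2.0, 4.0, 8.0]])
Y = np.array([[2.0, 4.0, 4.0]])
scores = [dea_ccr_efficiency(X, Y, k) for k in range(3)]
```

Firm 3 here uses twice the input of firm 2 for the same output, so its efficiency score is 0.5 while the first two firms lie on the frontier.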

Relevance:

100.00%

Publisher:

Abstract:

Motivation: Prediction methods for identifying binding peptides could minimize the number of peptides required to be synthesized and assayed, and thereby facilitate the identification of potential T-cell epitopes. We developed a bioinformatic method for the prediction of peptide binding to MHC class II molecules. Results: Experimental binding data and expert knowledge of anchor positions and binding motifs were combined with an evolutionary algorithm (EA) and an artificial neural network (ANN): binding data extraction → peptide alignment → ANN training and classification. This method, termed PERUN, was implemented for the prediction of peptides that bind to HLA-DR4(B1*0401). The respective positive predictive values of PERUN predictions of high-, moderate-, low- and zero-affinity binders were assessed as 0.8, 0.7, 0.5 and 0.8 by cross-validation, and 1.0, 0.8, 0.3 and 0.7 by experimental binding. This illustrates the synergy between experimentation and computer modeling, and its application to the identification of potential immunotherapeutic peptides.
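The per-class positive predictive values quoted above are simply the fraction of peptides predicted in a class whose measured affinity agrees. A minimal sketch, with fabricated labels rather than PERUN's data:

```python
# Positive predictive value (PPV) per affinity class, as used to assess
# PERUN's predictions. The labels below are fabricated for illustration.

def positive_predictive_value(predicted, actual, cls):
    """Fraction of peptides predicted as `cls` whose measured class agrees."""
    hits = [p == a for p, a in zip(predicted, actual) if p == cls]
    return sum(hits) / len(hits) if hits else float("nan")

predicted = ["high", "high", "low", "zero", "high", "moderate"]
actual    = ["high", "moderate", "low", "zero", "high", "moderate"]
ppv_high = positive_predictive_value(predicted, actual, "high")  # 2/3
```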

Relevance:

100.00%

Publisher:

Abstract:

Optical constants of AlSb, GaSb, and InSb are modeled in the 1–6 eV spectral range. We employ an extension of Adachi's model of the optical constants of semiconductors. The model takes into account transitions at the E₀, E₀ + Δ₀, E₁, and E₁ + Δ₁ critical points, as well as higher-lying transitions, which are modeled with three damped harmonic oscillators. We do not consider the contribution of indirect transitions, since it represents a second-order perturbation and its strength should be low. Nor do we take into account excitonic effects at the E₁ and E₁ + Δ₁ critical points, since we model the room-temperature data. In spite of fewer contributions to the dielectric function compared with previous calculations involving Adachi's model, our calculations show significantly improved agreement with the experimental data. This is due to the two main distinguishing features of the calculations presented here: the use of adjustable line broadening instead of the conventional Lorentzian one, and the employment of a global optimization routine for model parameter determination.
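The damped-harmonic-oscillator terms used for the higher-lying transitions have the standard form εⱼ(E) = fⱼ / (Eⱼ² − E² − iΓⱼE). A sketch with illustrative, unfitted parameters (not the paper's values for AlSb, GaSb, or InSb):

```python
# Sketch of the damped-harmonic-oscillator (DHO) contribution:
#   eps(E) = sum_j f_j / (E_j**2 - E**2 - 1j * Gamma_j * E)
# The three (f, E_j, Gamma) parameter sets below are assumptions for the
# illustration, not fitted antimonide values.
import numpy as np

def dho_dielectric(E, oscillators):
    """Dielectric-function contribution of damped harmonic oscillators.
    E: photon energy in eV; oscillators: iterable of (f, E0, Gamma)."""
    E = np.asarray(E, dtype=complex)
    eps = np.zeros_like(E)
    for f, E0, gamma in oscillators:
        eps = eps + f / (E0**2 - E**2 - 1j * gamma * E)
    return eps

oscs = [(10.0, 3.5, 0.5), (5.0, 4.5, 0.8), (2.0, 5.5, 1.0)]
E_grid = np.linspace(1.0, 6.0, 11)
eps = dho_dielectric(E_grid, oscs)   # complex dielectric function
n_k = np.sqrt(eps)                   # complex refractive index n + ik
```

The paper's adjustable (non-Lorentzian) broadening would replace the constant Γⱼ in the denominator; the fixed-Γ form above is the conventional baseline it improves on.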

Relevance:

100.00%

Publisher:

Abstract:

This study examined the impact of computer and assistive device use on the employment status and vocational modes of people with physical disabilities in Australia. A survey was distributed to people over 15 years of age with physical disabilities living in the Brisbane area. Responses were received from 82 people, including those with spinal cord injuries, cerebral palsy and muscular dystrophy. Of the respondents, 46 were employed, 22 were unemployed, and 12 were either students or undertaking voluntary work. Three-quarters of respondents used a computer in their occupations, while 15 used assistive devices. Using logistic regression analysis, it was found that gender, education, level of computer skill and computer training were significant predictors of employment outcomes. Neither the age of respondents nor the use of assistive software was a significant predictor. From the information obtained in this study, guidelines for a training programme designed to maximize the employability of people with physical disabilities were developed.
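A logistic regression of this kind models the probability of employment as a function of the predictors. The sketch below uses fabricated data and two stand-in predictors, not the survey's responses or full predictor set:

```python
# Illustrative logistic regression of employment status (1/0) on
# predictors such as education and computer-skill level. The data are
# fabricated for the sketch and are NOT the survey's responses.
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=5000):
    """Fit logistic-regression weights by gradient descent on the log-loss."""
    X = np.hstack([np.ones((len(X), 1)), X])   # add intercept column
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))       # predicted P(employed)
        w -= lr * X.T @ (p - y) / len(y)       # mean log-loss gradient
    return w

def predict(w, X):
    X = np.hstack([np.ones((len(X), 1)), X])
    return (1.0 / (1.0 + np.exp(-X @ w)) >= 0.5).astype(int)

# Fabricated predictors: [years_of_education, computer_skill_0_to_3].
X = np.array([[16, 3], [12, 1], [18, 3], [10, 0], [14, 2], [11, 1]], float)
y = np.array([1, 0, 1, 0, 1, 0])
mu, sd = X.mean(axis=0), X.std(axis=0)
w = fit_logistic((X - mu) / sd, y)             # standardise, then fit
```

A published analysis would of course report coefficient significance (e.g. Wald tests) rather than raw weights; this shows only the model form.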

Relevance:

100.00%

Publisher:

Abstract:

Over the past years, component-based software engineering has become an established paradigm in the area of complex software-intensive systems. However, many techniques for analyzing these systems for critical properties currently do not make use of the component orientation. In particular, safety analysis of component-based systems is an open field of research. In this chapter we investigate the problems that arise and define a set of requirements that apply when adapting the analysis of safety properties to a component-based software engineering process. Based on these requirements, some important component-oriented safety evaluation approaches are examined and compared.