838 results for Distributed Video Server
Abstract:
There is no agreement about the distinction between pathological, excessive and normal gaming. The present study compared two classifications for defining pathological gaming: the polythetic format (gamers who met at least half of the criteria) and the monothetic format (gamers who met all criteria). Associations with mental, health and social issues were examined to assess differences between subgroups of gamers. A representative sample of 5,663 young Swiss men filled in a questionnaire as part of the ongoing Cohort Study on Substance Use Risk Factors (C-SURF). Game use was assessed with the Game Addiction Scale. Mental, social and physical factors (depression, anxiety, aggressiveness, physical and mental health, social and health consequences), gambling and substance use (illicit drug use, alcohol dependence and problematic cannabis use) were also assessed. The results indicated that monothetic gamers shared problems with polythetic gamers, but were even more prone to mental health issues (depression, anxiety and aggressiveness) and more vulnerable to other addictions such as substance use, alcohol dependence and gambling. A second analysis using Latent Class Analysis confirmed the distinction between monothetic and polythetic gamers. These findings support the use of a monothetic format to diagnose pathological gaming and to differentiate it from excessive gaming.
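The two classification rules described above reduce to a simple threshold test. The sketch below is a hedged illustration only: the abstract does not state the number of criteria, so the seven items commonly attributed to the Game Addiction Scale are assumed here.

```python
def classify_gamer(criteria_met, total_criteria=7):
    """Classify a gamer under the two formats compared in the study.

    criteria_met: number of addiction criteria the gamer endorses.
    Polythetic format: at least half of the criteria are met.
    Monothetic format: all criteria are met.
    total_criteria=7 is an assumption, not stated in the abstract.
    """
    polythetic = criteria_met >= (total_criteria + 1) // 2
    monothetic = criteria_met == total_criteria
    if monothetic:
        return "monothetic"
    if polythetic:
        return "polythetic"
    return "non-pathological"

# Every monothetic gamer also meets the polythetic threshold, which is
# why the study can compare them as nested subgroups.
print(classify_gamer(7))  # monothetic
print(classify_gamer(4))  # polythetic
print(classify_gamer(2))  # non-pathological
```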
Abstract:
BACKGROUND AND AIMS: Evidence-based and reliable measures of addictive disorders are needed in general population-based assessments. One study suggested that heavy use over time (UOT) should be used instead of self-reported addiction scales (AS). This study compared UOT and AS regarding video gaming and internet use empirically, using associations with comorbid factors. DESIGN: Cross-sectional data from the 2011 French Survey on Health and Consumption on Call-up and Preparation for Defence-Day (ESCAPAD), cross-sectional data from the 2012 Swiss ado@internet.ch study and two waves of longitudinal data (2010-13) of the Swiss Longitudinal Cohort Study on Substance Use Risk Factors (C-SURF). SETTING: Three representative samples from the general population of French and Swiss adolescents and young Swiss men, aged approximately 17, 14 and 20 years, respectively. PARTICIPANTS: ESCAPAD: n = 22 945 (47.4% men); ado@internet.ch: n = 3049 (50% men); C-SURF: n = 4813 (baseline + follow-up, 100% men). MEASUREMENTS: We assessed video gaming/internet UOT [ESCAPAD and ado@internet.ch: number of hours spent online per week; C-SURF: latent score of time spent gaming/using the internet] and AS (ESCAPAD: Problematic Internet Use Questionnaire; ado@internet.ch: Internet Addiction Test; C-SURF: Gaming AS). Comorbidities were assessed with health outcomes (ESCAPAD: physical health evaluation with a single item, suicidal thoughts and an appointment with a psychiatrist; ado@internet.ch: WHO-5 and somatic health problems; C-SURF: Short Form 12 Health Survey (SF-12) and Major Depression Inventory (MDI)). FINDINGS: UOT and AS were correlated moderately (ESCAPAD: r = 0.40, ado@internet.ch: r = 0.53, C-SURF: r = 0.51). Associations of AS with comorbidity factors were higher than those of UOT in cross-sectional (AS: 0.005 ≤ |b| ≤ 2.500, UOT: 0.001 ≤ |b| ≤ 1.000) and longitudinal analyses (AS: 0.093 ≤ |b| ≤ 1.079, UOT: 0.020 ≤ |b| ≤ 0.329).
The results were similar across genders in ESCAPAD and ado@internet.ch (men: AS: 0.006 ≤ |b| ≤ 0.211, UOT: 0.001 ≤ |b| ≤ 0.061; women: AS: 0.004 ≤ |b| ≤ 0.155, UOT: 0.001 ≤ |b| ≤ 0.094). CONCLUSIONS: The measurement of heavy use over time captures part of addictive video gaming/internet use without overlapping to a large extent with self-reported addiction scales. Measuring addictive video gaming/internet use via self-reported addiction scales relates more strongly to comorbidity factors than heavy use over time.
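The moderate correlations reported above (r = 0.40 to 0.53) can be reproduced in principle with a plain Pearson correlation between hours of use and scale scores. The sketch below uses invented toy data purely for illustration; it is not the study's data or analysis code.

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Toy data: hours online per week (UOT) and addiction-scale scores (AS).
uot = [2, 5, 10, 20, 40, 8, 15, 30]
as_score = [3, 4, 9, 14, 25, 6, 12, 16]
r = pearson_r(uot, as_score)
# A moderate-to-strong positive r mirrors the abstract's point: the two
# measures overlap but are not interchangeable.
print(round(r, 2))
```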
Abstract:
The activated sludge process - the main biological technology usually applied in wastewater treatment plants (WWTP) - directly depends on live beings (microorganisms), and therefore on the unforeseen changes they produce. Good plant operation is possible if the supervisory control system is able to react to changes and deviations in the system and can take the necessary actions to restore the system's performance. These decisions are often based both on physical, chemical and microbiological principles (suitable to be modelled by conventional control algorithms) and on some knowledge (suitable to be modelled by knowledge-based systems). But one of the key problems in knowledge-based control systems design is the development of an architecture able to efficiently manage the different elements of the process (integrated architecture), to learn from previous cases (specific experimental knowledge) and to acquire the domain knowledge (general expert knowledge). These problems increase when the process belongs to an ill-structured domain and is composed of several complex operational units. Therefore, an integrated and distributed AI architecture seems to be a good choice. This paper proposes an integrated and distributed supervisory multi-level architecture for the supervision of WWTP that overcomes some of the main troubles of classical control techniques and of knowledge-based systems applied to real-world systems.
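The layering of expert knowledge on top of numerical process readings described above can be illustrated with a minimal rule-firing loop. This is a generic sketch of a knowledge-based supervisory check, not the paper's architecture; the sensor names and thresholds are hypothetical.

```python
def supervise(readings, rules):
    """Apply knowledge-based rules on top of numerical process readings.

    readings: dict of measured process variables.
    rules: list of (condition, action) pairs encoding expert knowledge.
    Returns the actions whose conditions fire, in rule order.
    """
    return [action for condition, action in rules if condition(readings)]

# Hypothetical rules for an activated-sludge plant (illustrative only):
rules = [
    (lambda r: r["dissolved_oxygen"] < 1.0, "increase aeration"),
    (lambda r: r["effluent_turbidity"] > 50, "check settler / possible bulking"),
]
print(supervise({"dissolved_oxygen": 0.6, "effluent_turbidity": 20}, rules))
# ['increase aeration']
```

In a real multi-level architecture these symbolic rules would sit above conventional numeric control loops, intervening only when the process drifts outside the range those loops can handle.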
Abstract:
Previous genetic studies have demonstrated that natal homing shapes the stock structure of marine turtle nesting populations. However, widespread sharing of common haplotypes based on short segments of the mitochondrial control region often limits resolution of the demographic connectivity of populations. Recent studies employing longer control region sequences to resolve haplotype sharing have focused on regional assessments of genetic structure and phylogeography. Here we synthesize available control region sequences for loggerhead turtles from the Mediterranean Sea, Atlantic, and western Indian Ocean basins. These data represent six of the nine globally significant regional management units (RMUs) for the species and include novel sequence data from Brazil, Cape Verde, South Africa and Oman. Genetic tests of differentiation among 42 rookeries represented by short sequences (380 bp haplotypes from 3,486 samples) and 40 rookeries represented by long sequences (~800 bp haplotypes from 3,434 samples) supported the distinction of the six RMUs analyzed as well as recognition of at least 18 demographically independent management units (MUs) with respect to female natal homing. A total of 59 haplotypes were resolved. These haplotypes belonged to two highly divergent global lineages, with haplogroup I represented primarily by CC-A1, CC-A4, and CC-A11 variants and haplogroup II represented by CC-A2 and derived variants. Geographic distribution patterns of haplogroup II haplotypes and the nested position of CC-A11.6 from Oman among the Atlantic haplotypes invoke recent colonization of the Indian Ocean from the Atlantic for both global lineages. The haplotypes we confirmed for western Indian Ocean RMUs allow reinterpretation of previous mixed stock analysis and further suggest that contemporary migratory connectivity between the Indian and Atlantic Oceans occurs on a broader scale than previously hypothesized. 
This study represents a valuable model for conducting comprehensive international cooperative data management and research in marine ecology.
Abstract:
Simulation has traditionally been used for analyzing the behavior of complex real-world problems. Even though only some features of the problems are considered, simulation time tends to become quite high even for common simulation problems. Parallel and distributed simulation is a viable technique for accelerating such simulations. The success of parallel simulation depends heavily on the combination of the simulation application, the algorithm and the environment. In this thesis a conservative parallel simulation algorithm is applied to the simulation of a cellular network application in a distributed workstation environment. This thesis presents a distributed simulation environment, Diworse, which is based on the use of networked workstations. The distributed environment is considered especially hard for conservative simulation algorithms due to the high cost of communication. In this thesis, however, the distributed environment is shown to be a viable alternative if the amount of communication is kept reasonable. The novel ideas of multiple message simulation and channel reduction enable efficient use of this environment for the simulation of a cellular network application. The distribution of the simulation is based on a modification of the well-known Chandy-Misra deadlock avoidance algorithm with null messages. The basic Chandy-Misra algorithm is modified with the null message cancellation and multiple message simulation techniques. The modifications reduce the number of null messages and the time required for their execution, thus reducing the overall simulation time. The null message cancellation technique reduces the processing time of null messages, as an arriving null message cancels other unprocessed null messages. The multiple message simulation technique forms groups of messages, simulating several messages before releasing the newly created messages.
If the message population in the simulation is sufficient, no additional delay is caused by this operation. A new technique for taking the simulation application into account is also presented. The performance is improved by establishing a neighborhood for the simulation elements. The neighborhood concept is based on a channel reduction technique, where the properties of the application exclusively determine which connections are necessary when a certain accuracy of simulation results is required. The distributed simulation is also analyzed in order to find out the effect of the different elements of the implemented simulation environment. This analysis is performed using critical path analysis, which allows determination of a lower bound for the simulation time. In this thesis, critical times are computed for sequential and parallel traces. The analysis based on sequential traces reveals the parallel properties of the application, whereas the analysis based on parallel traces reveals the properties of the environment and the distribution.
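The null-message mechanism and the cancellation rule described above can be sketched in a few lines. This is a simplified single-queue illustration of the Chandy-Misra idea, not the Diworse implementation; the class and method names are hypothetical.

```python
import heapq

class LogicalProcess:
    """Simplified conservative LP using Chandy-Misra null messages."""

    def __init__(self, name, lookahead):
        self.name = name
        self.lookahead = lookahead
        self.clock = 0.0
        self.queue = []  # min-heap of (timestamp, kind) pairs

    def receive(self, timestamp, kind):
        if kind == "null":
            # Null-message cancellation: a newer null message makes older
            # unprocessed null messages redundant, so drop them first.
            self.queue = [(t, k) for t, k in self.queue if k != "null"]
            heapq.heapify(self.queue)
        heapq.heappush(self.queue, (timestamp, kind))

    def safe_to_process(self):
        """In a full algorithm an event is safe when it is the earliest
        message on every input channel; in this one-queue sketch, the
        heap minimum stands in for that bound."""
        return self.queue[0] if self.queue else None

    def next_null(self):
        """A null message promises not to send anything earlier than
        clock + lookahead, which is what prevents deadlock."""
        return self.clock + self.lookahead

lp = LogicalProcess("cell_A", lookahead=1.0)
lp.receive(5.0, "null")
lp.receive(3.0, "event")
lp.receive(6.0, "null")      # cancels the unprocessed 5.0 null message
print(lp.safe_to_process())  # (3.0, 'event')
print(len(lp.queue))         # 2
```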
Abstract:
Species structure and composition in Mediterranean riparian forests are determined by hydrological features, longitudinal zonation, and riverbank topography. This study assesses the distribution of four native riparian plants along the riverbank topographic gradient in three river stretches in southern Spain, with special emphasis on the locations occupied by adult and young individuals of each species. The studied stretches suffered minimal human disturbance, displayed semi-arid conditions, and had riparian areas wide enough to allow the development of the target species: black alder (Alnus glutinosa), salvia leaf willow (Salix salviifolia), narrow-leafed ash (Fraxinus angustifolia), and oleander (Nerium oleander). Thalweg height was used to define the riverbank topographic gradient. The results showed a preferential zone for black alder and salvia leaf willow in the range of 0-150 cm above the channel thalweg, with adult alders and willows being more common between 51 and 150 cm and young alders being more common under 50 cm. Conversely, narrow-leafed ash and oleander were much more frequent, and showed greater development, in the ranges of 151-200 cm and 201-250 cm, respectively, whereas young individuals of both species covered the entire topographic range. Adult individuals of the four species were spatially segregated along the riverbank topographic gradient, indicating their differential ability to cope with water stress, from the non-tolerant alders and willows to the more tolerant narrow-leafed ash trees and oleanders. Young individuals, however, showed a strategy more closely linked to the initial availability of colonisation sites within the riparian areas, to the dispersal strategy of each species and to the distribution of adult individuals. In Mediterranean areas, where riparian management has traditionally faced great challenges, incorporating species preferences along riverbank gradients could improve the performance of restoration projects.
Abstract:
This study examines Smart Grids and distributed generation connected to a single-family house. The distributed generation comprises a small wind power plant and solar panels. The study takes the consumer's point of view and is divided into two parts, a theoretical part and a research part. The theoretical part covers the definitions of distributed generation, wind power, solar energy and Smart Grids, examines what Smart Grids will enable, and reviews new Smart Grid technology. The research part introduces the wind and sun conditions of two countries, Finland and Germany. Based on these conditions, the annual electricity production of the wind power plant and the solar panels is calculated, and the costs of generating electricity from wind and solar energy are derived from these production figures. The study also deals with feed-in tariffs, which are support systems for renewable energy sources, and examines whether it is cost-effective for consumers to use the produced electricity themselves or to sell it to the grid. Finally, figures are presented for both countries, combining the calculated cost of generating electricity from the wind power plant and solar panels, retail and wholesale prices, and feed-in tariffs. In Finland, selling the produced electricity to the grid is not cost-effective until support systems exist. In Germany, selling the electricity produced by solar panels to the grid is cost-effective because of the feed-in tariffs, whereas electricity produced from wind is more cost-effective to use oneself, because the retail price is higher than the price obtained for the produced wind electricity.
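The cost-effectiveness question posed above reduces to comparing three per-kWh figures: the production cost, the retail price saved by self-use, and the feed-in tariff earned by selling. The sketch below shows that comparison with invented placeholder prices, not the thesis's calculated figures.

```python
def best_use_of_production(production_cost, retail_price, feed_in_tariff):
    """Decide whether self-produced electricity pays off best used or sold.

    All arguments are per-kWh prices (e.g. EUR/kWh).
    Self-use saves the retail price; selling earns the feed-in tariff.
    Either option is only worthwhile if it beats the production cost.
    """
    self_use_margin = retail_price - production_cost
    sell_margin = feed_in_tariff - production_cost
    if max(self_use_margin, sell_margin) < 0:
        return "not cost-effective"
    return "self-use" if self_use_margin >= sell_margin else "sell to grid"

# Illustrative placeholder values only:
print(best_use_of_production(0.10, 0.25, 0.05))  # self-use (retail > tariff)
print(best_use_of_production(0.30, 0.25, 0.45))  # sell to grid (high tariff)
print(best_use_of_production(0.50, 0.25, 0.30))  # not cost-effective
```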
Abstract:
In this paper we address the problem of extracting representative point samples from polygonal models. The goal of such a sampling algorithm is to find points that are evenly distributed. We propose star-discrepancy as a measure of sampling quality and present new sampling methods based on global line distributions. We investigate several line generation algorithms, including an efficient hardware-based sampling method. Our method contributes to the area of point-based graphics by extracting points that are more evenly distributed than those produced by current sampling algorithms.
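Star discrepancy, the quality measure proposed above, has a simple closed form in one dimension for points in [0, 1]: lower values mean more evenly distributed points. The sketch below computes that 1-D form only; it is a hedged illustration, not the paper's surface-sampling method.

```python
def star_discrepancy_1d(points):
    """Exact star discrepancy of a 1-D point set in [0, 1].

    For sorted points x_(1) <= ... <= x_(n):
        D* = max_i max( i/n - x_(i), x_(i) - (i-1)/n )
    Lower values mean the points are more evenly distributed.
    """
    xs = sorted(points)
    n = len(xs)
    return max(max(i / n - x, x - (i - 1) / n)
               for i, x in enumerate(xs, start=1))

# Evenly spaced midpoints achieve the minimum possible value 1/(2n):
even = [(2 * i + 1) / 8 for i in range(4)]   # 0.125, 0.375, 0.625, 0.875
print(star_discrepancy_1d(even))             # 0.125
# Clustered points score strictly worse:
print(star_discrepancy_1d([0.1, 0.12, 0.15, 0.9]) > 0.125)  # True
```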
Abstract:
Load testing is part of the development and deployment of web applications; it ensures that applications work under a predetermined load. Microsoft Office SharePoint Server 2007 (MOSS) is a server product for creating and maintaining modern web applications. This thesis compares two load-testing tools, SilkPerformer 2008 and Visual Studio Team System 2008 Test Edition, and selects the one better suited to MOSS web applications. The tools are compared on the basis of their features and by load testing a MOSS web application created for this purpose. The tools are evaluated against the relevant factors, and this evaluation yields the result of the comparison. As the outcome, Visual Studio Team System 2008 Test Edition is the better-suited load-testing tool for MOSS web applications. The comparison showed, however, that the tools are fairly evenly matched, and which one suits better depends on the practical situation. This thesis helps in making that choice.
Abstract:
Today a typical embedded system (e.g. a mobile phone) must deliver high performance to carry out tasks such as real-time encoding/decoding; it must consume little energy in order to run for hours or days on light batteries; it must be flexible enough to integrate multiple applications and standards in a single device; and it must be designed and verified in a short time despite increasing complexity. Designers struggle against these demands, which call for new innovations in architectures and design methodologies. Coarse-grained reconfigurable architectures (CGRAs) are emerging as potential candidates to overcome all these difficulties, and different architectures of this type have been presented in recent years. Their coarse granularity greatly reduces delay, area, power consumption and configuration time compared with FPGAs. On the other hand, compared with traditional coarse-grained programmable processors, their abundant computational resources allow them to achieve a high level of parallelism and efficiency. Nevertheless, existing CGRAs have seen little adoption, mainly because of the great difficulty of programming such complex architectures. ADRES is a new CGRA designed by the Interuniversity Micro-Electronics Center (IMEC). It combines a very-long-instruction-word (VLIW) processor and a coarse-grained array, offering two different options in a single physical device. Its advantages include high performance, little communication redundancy and ease of programming. Finally, ADRES is a template rather than a concrete architecture: with the help of the DRESC (Dynamically Reconfigurable Embedded System Compiler) compiler, it is possible to find better architectures, or architectures specific to a given application. This work presents the implementation of an MPEG-4 encoder for ADRES and shows the evolution of the code towards a good implementation for a given architecture. The main features of ADRES and its compiler (DRESC) are also presented. The goals are to minimize the number of cycles (time) needed to implement the MPEG-4 encoder and to examine the difficulties of working in the ADRES environment. The results show that the cycle count is reduced by 67% between the initial and final code in VLIW mode, and by 84% between the initial code in VLIW mode and the final code in CGA mode.
Abstract:
The environmental impact of landfills is a growing concern in waste management practice, so assessing the effectiveness of the solutions implemented to mitigate the issue is important. The objectives of the study were to provide an insight into the advantages of landfills and to consolidate the importance of landfill gas among alternative fuels. Finally, a case study examining the performance of energy production from a landfill at Ylivieska was carried out to ascertain the viability of a waste-to-energy project. Both qualitative and quantitative methods were applied. The study was conducted in two parts; the first was a review of the literature on landfill gas developments. Specific considerations were the mechanisms governing the variability of gas production and the mathematical models often used in landfill gas modelling. Furthermore, the two main distributed generation technologies used to generate energy from landfills were analysed. The literature review revealed the strong influence of waste segregation and of a high moisture content on the waste stabilization process. It was found that the accuracy of forecast gas generation rates can be enhanced with both mathematical modelling and field test measurements. The result of the case study mainly indicated the close dependence of the power output on the landfill gas quality and the fuel inlet pressure.
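The mathematical models mentioned above are commonly first-order decay models, in which each year's deposited waste generates gas at a rate that decays exponentially from its burial year. The sketch below is a generic first-order decay estimate with placeholder parameters; it is not the case study's model or data.

```python
import math

def landfill_gas_rate(waste_by_year, year, k=0.05, L0=100.0):
    """First-order decay estimate of methane generation in a given year.

    waste_by_year: {deposit_year: tonnes of waste} (placeholder values below)
    k:  decay rate constant (1/year); higher moisture content raises it
    L0: methane generation potential (m^3 per tonne of waste)
    Returns an estimated generation rate in m^3/year.
    """
    return sum(k * L0 * mass * math.exp(-k * (year - t))
               for t, mass in waste_by_year.items() if year >= t)

deposits = {2000: 10_000, 2001: 12_000, 2002: 9_000}  # placeholder tonnages
# Generation peaks shortly after deposits stop, then declines:
print(round(landfill_gas_rate(deposits, 2003)))
print(landfill_gas_rate(deposits, 2030) < landfill_gas_rate(deposits, 2003))  # True
```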
Abstract:
IT outsourcing refers to the way companies focus on their core competencies and buy the supporting functions from other companies specialized in that area. A service is the total outcome of numerous activities by employees and other resources providing solutions to customers' problems. Outsourcing and service business have their own unique characteristics. Service Level Agreements quantify the minimum acceptable service to the user; the service quality has to be objectively quantified so that its achievement or non-achievement can be monitored. Offshoring usually refers to the transfer of tasks to low-cost nations and presents many challenges that require special attention and thorough assessment. IT infrastructure management refers to the installation and basic usability assistance of operating systems, network and server tools and utilities. ITIL defines the industry best practices for organizing IT processes. This thesis analysed a server operations service and the customers' perception of the quality of daily operations. The agreed workflows and processes should be followed better: the service provider's processes are thoroughly defined, but both the customer and the service provider might disobey them. The service provider should review the workflows regarding customer-facing functions, which require persistent skill development, as they communicate the quality to the customer. The service provider also needs better organized communication and knowledge exchange methods between specialists in different geographical locations.
Abstract:
The development of software tools began as the first computers were built. The current generation of development environments offers a common interface for accessing multiple software tools and often also provides the possibility to build custom tools as extensions to the existing development environment. Eclipse is an open-source development environment that offers a good starting point for developing custom extensions. This thesis presents a software tool to aid the development of context-aware applications on the Multi-User Publishing Environment (MUPE) platform. The tool is implemented as an Eclipse plug-in. It allows developers to include external server-side contexts in their MUPE applications, and additional context sources can be added through Eclipse's extension point mechanism. The thesis describes how the tool was designed and implemented. The implementation consists of a tool core component and an additional context source extension part: the core component is responsible for the actual context addition and provides the needed user interface elements to the Eclipse workbench, while the context source component provides the needed context-source-related information to the core component. As part of the work, an update site feature was also implemented for distributing the tool through the Eclipse update mechanism.
Abstract:
Conference presentation: PHOTOGRAPHY NEXT International Conference at Moderna museet and Nordiska Museet, Stockholm, 4-5 February 2010