935 results for accessibility analysis tools
Abstract:
The aim of this paper is to present the practice of companies operating in Hungary in managing the distribution side of the supply chain (the demand chain), drawing on the results of empirical research. The paper consists of two parts. The first part presents a literature review of the management tools companies may use in their distribution processes within the supply chain. The second part introduces the results of the empirical research. Ninety-two companies participated in the survey (79 of which could be included in the analysis); based on their responses and the statistical analysis, a picture emerges of how intensively they apply demand chain management tools and what levels of maturity can be distinguished by the extent of their application.
Abstract:
The evolution of cellular systems towards the third generation (3G), or IMT-2000, shows a tendency to adopt W-CDMA as the standard access method, as ETSI decisions have shown. However, questions remain about the capacity improvements and the suitability of this access method. One of the aspects that worries developers and researchers planning the third generation is the growing use of the Internet and of increasingly bandwidth-hungry applications. This work shows the performance of a W-CDMA system simulated on a PC using coverage maps generated with DC-Cell, a GIS-based planning tool developed by the Technical University of Valencia, Spain. The maps are exported to MATLAB and used in the model. The system studied consists of several microcells in a downtown area. We analyse the interference from users in the same cell and in adjacent cells and its effect on the system, assuming perfect power control in each cell. The traffic generated by the simulator is voice and data. This model allows us to work with more accurate coverage and is a good approach for analysing the multiple access interference (MAI) problem in microcellular systems with irregular coverage. Finally, we compare the results obtained with the performance of a similar system using TDMA.
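The capacity question the abstract raises can also be approached analytically. As a rough sketch (this is the classic Gilhousen-style back-of-the-envelope estimate, not the paper's GIS-based simulator, and all parameter values below are illustrative assumptions):

```python
def cdma_uplink_capacity(chip_rate_hz, bit_rate_hz, ebno_db,
                         voice_activity=0.5, other_cell_factor=0.55):
    """Rough analytical estimate of CDMA uplink users per cell.

    Capacity grows with the processing gain (chip rate / bit rate) and
    shrinks with the required Eb/N0, the voice activity factor and the
    other-cell interference ratio. Default values are assumptions.
    """
    processing_gain = chip_rate_hz / bit_rate_hz
    ebno_linear = 10 ** (ebno_db / 10)
    return 1 + processing_gain / (
        ebno_linear * voice_activity * (1 + other_cell_factor))

# Example: W-CDMA chip rate (3.84 Mcps), 12.2 kbps voice, 5 dB Eb/N0
users = cdma_uplink_capacity(3.84e6, 12.2e3, 5.0)
```

Such a formula ignores irregular coverage, which is precisely what the simulated coverage maps in the study capture; it is only useful as a first-order check.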
Abstract:
We show a simulation model for capacity analysis in mobile systems using a geographic information system (GIS) based tool, used for coverage calculations and frequency assignment, together with MATLAB. The model was developed initially for “narrowband” CDMA and TDMA, but was modified for W-CDMA. We also show some results for a specific case in “narrowband” CDMA.
Abstract:
We explore the recently developed snapshot-based dynamic mode decomposition (DMD) technique, a matrix-free Arnoldi-type method, to predict 3D linear global flow instabilities. We apply the DMD technique to flows confined in an L-shaped cavity and compare the resulting modes to their counterparts obtained from classic, matrix-forming, linear instability analysis (i.e. the BiGlobal approach) and direct numerical simulations. Results show that the DMD technique, which uses snapshots generated by a 3D non-linear incompressible discontinuous Galerkin Navier-Stokes solver, provides very similar results to classical linear instability analysis techniques. In addition, we compare DMD results obtained from non-linear and linearised Navier-Stokes solvers, showing that linearisation is not necessary (i.e. a base flow is not required) to obtain linear modes, as long as the analysis is restricted to the exponential growth regime, that is, the flow regime governed by the linearised Navier-Stokes equations, and showing the potential of this type of snapshot-based analysis for general-purpose CFD codes, without the need for modifications. Finally, this work shows that the DMD technique can provide three-dimensional direct and adjoint modes through snapshots provided by the linearised and adjoint linearised Navier-Stokes equations advanced in time. Subsequently, these modes are used to provide structural sensitivity maps and sensitivity to base-flow modification for 3D flows and complex geometries, at an affordable computational cost. The information provided by the sensitivity study is used to modify the L-shaped geometry and control the most unstable 3D mode.
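For readers unfamiliar with the method, snapshot-based DMD can be sketched in a few lines. This is a generic "exact DMD" sketch over equally spaced snapshots, not the authors' solver pipeline; the snapshot matrix and any rank choice are up to the user.

```python
import numpy as np

def dmd(X, r=None):
    """Snapshot-based dynamic mode decomposition.

    X: (n, m) array whose columns are m state snapshots at equal time
    steps; r: optional SVD truncation rank.
    Returns the eigenvalues and DMD modes of the best-fit linear
    operator A with X2 ~ A @ X1.
    """
    X1, X2 = X[:, :-1], X[:, 1:]                 # paired snapshots
    U, s, Vh = np.linalg.svd(X1, full_matrices=False)
    if r is not None:
        U, s, Vh = U[:, :r], s[:r], Vh[:r, :]
    # Projection of A onto the POD basis: A_tilde = U* A U
    A_tilde = U.conj().T @ X2 @ Vh.conj().T @ np.diag(1.0 / s)
    eigvals, W = np.linalg.eig(A_tilde)
    modes = X2 @ Vh.conj().T @ np.diag(1.0 / s) @ W   # exact DMD modes
    return eigvals, modes
```

Eigenvalues of magnitude greater than one correspond to exponentially growing (unstable) modes; dividing their logarithms by the snapshot interval recovers growth rates and frequencies, which is what links DMD output to the linear global stability results discussed above.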
Abstract:
The emergence of mass spectrometry-based proteomics has revolutionized the study of proteins and their abundances, functions, interactions, and modifications. However, in a multicellular organism, it is difficult to monitor dynamic changes in protein synthesis in a specific cell type within its native environment. In this thesis, we describe methods that enable the metabolic labeling, purification, and analysis of proteins in specific cell types and during defined periods in live animals. We first engineered a eukaryotic phenylalanyl-tRNA synthetase (PheRS) to selectively recognize the unnatural L-phenylalanine analog p-azido-L-phenylalanine (Azf). Using Caenorhabditis elegans, we expressed the engineered PheRS in a cell type of choice (i.e. body wall muscles, intestinal epithelial cells, neurons, pharyngeal muscles), permitting proteins in those cells -- and only those cells -- to be labeled with azides. Labeled proteins are therefore subject to "click" conjugation to cyclooctyne-functionalized affinity probes, separation from the rest of the protein pool and identification by mass spectrometry. By coupling our methodology with heavy isotopic labeling, we successfully identified proteins -- including proteins with previously unknown expression patterns -- expressed in targeted subsets of cells. While cell types like body wall or pharyngeal muscles can be targeted with a single promoter, many cells cannot; spatiotemporal selectivity typically results from the combinatorial action of multiple regulators. To enhance spatiotemporal selectivity, we next developed a two-component system to drive overlapping -- but not identical -- patterns of expression of engineered PheRS, restricting labeling to cells that express both elements. Specifically, we developed a split-intein-based split-PheRS system for highly efficient PheRS reconstitution through protein splicing.
Together, these tools represent a powerful approach for unbiased discovery of proteins uniquely expressed in a subset of cells at specific developmental stages.
Abstract:
Changing or creating an organisation means creating a new process. Each process involves many risks that need to be identified and managed. The main risks considered here are procedural and legal risks. The former relate to errors that may occur during processes, while the latter relate to the compliance of processes with regulations. Managing these risks means proposing changes to the processes that achieve the desired result: an optimised process. To manage and optimise a company in the best possible way, the organisational aspect, risk management and legal compliance should not only each be taken into account; it is important that they are analysed simultaneously, with the aim of finding the right balance that satisfies them all. This is the aim of this thesis: to provide methods and tools to balance these three characteristics, with ICT support used to enable this type of optimisation. This is not a thesis in computer science or in law, but an interdisciplinary one. Most of the work done so far is vertical, within a single domain. The particularity and aim of this thesis is not to carry out an in-depth analysis of one particular aspect, but rather to combine several important aspects, normally analysed separately, which nevertheless affect and influence one another. To carry out this kind of interdisciplinary analysis, the knowledge bases of both areas were drawn upon, and the combination and collaboration of experts in the various fields was necessary. Although the methodology described is generic and can be applied to any sector, the case study considered is a new type of healthcare service that allows patients with acute disease to be hospitalised at home. This provides the possibility of performing experiments using a real hospital database.
Abstract:
The use of the web to make government information and services available to citizens has become increasingly significant. Guaranteeing that this content and these services are accessible to every citizen is therefore essential, regardless of special needs or any other barriers. In Brazil, Decree-Law No. 5.296/2004 required all government bodies to adapt their websites to accessibility criteria by December 2005. In order to track how accessibility has evolved over the years and to assess the impact of this legislation, this article analyses the accessibility of Brazilian state government websites using samples collected between 1996 and 2007. The analyses were based on metrics obtained through evaluations with automatic tools. The results indicate that the legislation had little impact on the actual improvement of website accessibility over the period studied, with an improvement only in 2007. More effective public policies are needed so that people with special needs have their rights of access to public information and services on the web more broadly assured.
Abstract:
This article describes the design, implementation, and experiences with AcMus, an open and integrated software platform for room acoustics research, which comprises tools for measurement, analysis, and simulation of rooms for music listening and production. Through use of affordable hardware, such as laptops, consumer audio interfaces and microphones, the software allows evaluation of relevant acoustical parameters with stable and consistent results, thus providing valuable information in the diagnosis of acoustical problems, as well as the possibility of simulating modifications in the room through analytical models. The system is open-source and based on a flexible and extensible Java plug-in framework, allowing for cross-platform portability, accessibility and experimentation, thus fostering collaboration of users, developers and researchers in the field of room acoustics.
Abstract:
OBJECTIVE: To estimate the spatial intensity of urban violence events using wavelet-based methods and emergency room data. METHODS: Information on victims attended at the emergency room of a public hospital in the city of São Paulo, Southeastern Brazil, from January 1, 2002 to January 11, 2003 was obtained from hospital records. The spatial distribution of 3,540 events was recorded, and a uniform random procedure was used to allocate records with incomplete addresses. Point processes and wavelet analysis techniques were used to estimate the spatial intensity, defined as the expected number of events per unit area. RESULTS: Of all georeferenced points, 59% were accidents and 40% were assaults. There is a non-homogeneous spatial distribution of the events, with high concentration in two districts and along three large avenues in the southern area of the city of São Paulo. CONCLUSIONS: Hospital records combined with methodological tools to estimate the intensity of events are useful for studying urban violence. The wavelet analysis is useful in the computation of the expected number of events and their respective confidence bands for any sub-region and, consequently, in the specification of risk estimates that could be used in decision-making processes for public policies.
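The core idea of the intensity estimate (expected events per unit area from a point pattern) can be illustrated with a crude Haar-wavelet-style smoother: bin the events on a fine grid and keep only a coarse approximation level. This is a minimal sketch with assumed grid sizes, not the authors' estimator or their confidence-band construction.

```python
import numpy as np

def haar_intensity(x, y, bounds, n_bins=64, levels=3):
    """Crude wavelet-style intensity estimate for a 2-D point pattern.

    Events are binned on an n_bins x n_bins grid, then the counts are
    smoothed by keeping only the level-`levels` Haar approximation
    (i.e. averaging over 2**levels x 2**levels blocks).
    Returns the estimated intensity in events per unit area.
    """
    (x0, x1), (y0, y1) = bounds
    counts, _, _ = np.histogram2d(x, y, bins=n_bins,
                                  range=[(x0, x1), (y0, y1)])
    block = 2 ** levels
    coarse = counts.reshape(n_bins // block, block,
                            n_bins // block, block).mean(axis=(1, 3))
    smooth = np.repeat(np.repeat(coarse, block, axis=0), block, axis=1)
    cell_area = ((x1 - x0) / n_bins) * ((y1 - y0) / n_bins)
    return smooth / cell_area
```

Because block averaging preserves the total count, the estimated intensity still integrates to the number of observed events, which is the defining property the abstract relies on.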
Abstract:
Background: Tuberculosis is one of the most prominent health problems in the world, causing 1.75 million deaths each year. Rapid clinical diagnosis is important in patients who have comorbidities such as Human Immunodeficiency Virus (HIV) infection. Direct microscopy has low sensitivity and culture takes 3 to 6 weeks [1-3]. Therefore, new tools for TB diagnosis are necessary, especially in health settings with a high prevalence of HIV/TB co-infection. Methods: In a public reference TB/HIV hospital in Brazil, we compared the cost-effectiveness of two strategies for the diagnosis of pulmonary TB (PTB): acid-fast bacilli smear microscopy by Ziehl-Neelsen staining (AFB smear) plus culture, and AFB smear plus a colorimetric test (PCR dot-blot). From May 2003 to May 2004, sputum was collected consecutively from PTB suspects attending the Parthenon Reference Hospital. Sputum samples were examined by AFB smear, culture, and PCR dot-blot. The gold standard was a positive culture combined with the clinical definition of PTB. The cost analysis included health service and patient costs. Results: AFB smear plus PCR dot-blot requires the lowest laboratory investment in equipment (US$ 20,000). Total screening costs for AFB smear plus culture were 3.8 times those for AFB smear plus PCR dot-blot (US$ 5,635,760 versus US$ 1,498,660). Costs per correctly diagnosed case were US$ 50,773 for AFB smear plus culture and US$ 13,749 for AFB smear plus PCR dot-blot. AFB smear plus PCR dot-blot was more cost-effective than AFB smear plus culture when the cost of treating all correctly diagnosed cases was considered. The cost of returning patients who were not treated due to a negative result to the health service was higher for AFB smear plus culture than for AFB smear plus PCR dot-blot (US$ 374,778,045 versus US$ 110,849,055).
Conclusion: AFB smear combined with PCR dot-blot has the potential to be a cost-effective tool in the fight against PTB for patients attending the TB/HIV reference hospital.
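The cost-effectiveness ratios reported above are simple divisions of total cost by correctly diagnosed cases. As a sanity check, one can invert them; the implied case counts below are back-of-the-envelope arithmetic from the abstract's reported figures, not numbers stated in the paper.

```python
def implied_cases(total_cost_usd, cost_per_correct_case_usd):
    """Number of correctly diagnosed cases implied by the reported
    totals: total screening cost / cost per correctly diagnosed case."""
    return total_cost_usd / cost_per_correct_case_usd

# Figures taken from the abstract; resulting counts are inferred.
smear_culture_cases = implied_cases(5_635_760, 50_773)  # ~111 cases
smear_pcr_cases = implied_cases(1_498_660, 13_749)      # ~109 cases
```

The two strategies apparently yield a similar number of correct diagnoses, so the 3.8-fold difference in screening cost translates almost directly into the difference in cost per case.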
Abstract:
The aim of the study was to evaluate possible relationships between stress tolerance, training load, banal infections and salivary parameters during 4 weeks of regular training in fifteen basketball players. The Daily Analysis of Life Demands for Athletes questionnaire (sources and symptoms of stress) and the Wisconsin Upper Respiratory Symptom Survey were administered on a weekly basis. Salivary cortisol and salivary immunoglobulin A (SIgA) were collected at the beginning and at the end of the study, and measured by enzyme-linked immunosorbent assay (ELISA). Ratings of perceived exertion (training load) were also obtained. The results from repeated-measures ANOVA showed greater training loads, more upper respiratory tract infection episodes and more negative sensations for both symptoms and sources of stress at week 2 (p < 0.05). Significant increases in cortisol levels and decreases in SIgA secretion rate were noted from the beginning to the end of the study. Negative sensations for symptoms of stress at week 4 were inversely and significantly correlated with SIgA secretion rate. A positive and significant relationship between sources and symptoms of stress at week 4 and cortisol levels was verified. In summary, an approach combining psychometric tools and salivary biomarkers could be an efficient means of monitoring reaction to stress in sport. Copyright (C) 2010 John Wiley & Sons, Ltd.
Abstract:
Recent advances in energy generation technology and new directions in electricity regulation have made distributed generation (DG) more widespread, with significant consequent impacts on the operational characteristics of distribution networks. For this reason, new methods for identifying such impacts are needed, together with research and development of new tools and resources to maintain and facilitate continued expansion towards DG. This paper presents a study aimed at determining appropriate DG sites in distribution systems. The main considerations that determine DG sites are presented, together with an account of the advantages gained from correct DG placement. The paper defines quantitative and qualitative parameters evaluated with Digsilent (R), GARP3 (R) and DSA-GD software. A multi-objective approach based on the Bellman-Zadeh algorithm and fuzzy logic is used to determine appropriate DG sites. The study also aims to find acceptable DG locations both for distribution system feeders and for nodes inside a given feeder. (C) 2010 Elsevier Ltd. All rights reserved.
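The Bellman-Zadeh approach mentioned above treats each objective as a fuzzy set and takes the decision as their intersection: a candidate is scored by its worst membership, and the best candidate maximises that minimum. A minimal sketch, with purely hypothetical scores and objective names (the paper's actual criteria and software outputs are not reproduced here):

```python
import numpy as np

def bellman_zadeh_choice(memberships):
    """Bellman-Zadeh max-min decision rule.

    memberships: (n_sites, n_objectives) array with entries in [0, 1]
    giving how well each candidate DG site satisfies each fuzzy
    objective. The fuzzy decision is the intersection (minimum) over
    objectives; the best site maximises that minimum.
    """
    decision = memberships.min(axis=1)
    return int(decision.argmax()), decision

# Hypothetical scores for three candidate sites against three
# illustrative objectives (loss reduction, voltage profile, cost).
scores = np.array([[0.9, 0.4, 0.7],
                   [0.6, 0.8, 0.7],
                   [0.8, 0.5, 0.2]])
best, decision = bellman_zadeh_choice(scores)   # best == 1
```

The max-min rule rewards balanced sites: the second candidate wins here despite never having the top score on any single objective, which is exactly the "right balance" behaviour multi-objective DG siting looks for.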
Abstract:
Conventional threading operations involve two distinct machining processes: drilling and threading. They are therefore time consuming, since tools must be changed and the workpiece moved to another machine. This paper presents an analysis of the combined process (drilling followed by threading) using a single tool for both operations: the tap-milling tool. Before presenting the methodology used to evaluate this hybrid tool, the basics of ODS (operating deflection shapes) are briefly described. ODS and finite element modeling (FEM) were used during this research to optimize the process, aiming to achieve more stable machining conditions and to increase tool life. Both methods allowed the determination of the natural frequencies and displacements of the machining center and the optimization of the workpiece fixture system. The results showed an excellent correlation between the dynamic stability of the machining center-tool holder and tool life, avoiding premature catastrophic tool failure. Nevertheless, evidence showed that the tool is very sensitive to working conditions. Undoubtedly, the use of ODS and FEM eliminates empirical decisions concerning the optimization of machining conditions and drastically increases tool life. After the ODS and FEM studies, it was possible to optimize the process and the work-material fixture system and to machine more than 30,000 threaded holes without reaching the tool life limit or suffering a catastrophic failure.