910 results for Architecture for the physically handicapped
Abstract:
The flexibility of information systems (IS) has been studied as a way to improve adaptation in support of business agility, understood as the set of capabilities needed to compete more effectively and adapt to rapid changes in market conditions (Glossary of business agility terms, 2003). However, most work on IS flexibility has been limited to systems architecture, ignoring the analysis of interoperability as a part of flexibility from the requirements stage onward. This paper reports a PhD project that proposes an approach to developing IS with flexibility features, considering some challenges of flexibility in small and medium enterprises (SMEs), such as the lack of interoperability and the agility of their business. The motivations for this research are the high prices of IS in developing countries and the usefulness of organizational semiotics to support the analysis of requirements for IS (Liu, 2005).
Abstract:
In recent years, research into the impact of genetic abnormalities on cognitive development, including language, has become recognized for its potential to make valuable contributions to our understanding of the brain–behaviour relationships underlying language acquisition as well as to understanding the cognitive architecture of the human mind. The publication of Fodor’s (1983) book The Modularity of Mind has had a profound impact on the study of language and the cognitive architecture of the human mind. Its central claim is that many of the processes involved in comprehension are undertaken by special brain systems termed ‘modules’. This domain specificity of language, or modularity, has become a fundamental feature that differentiates competing theories and accounts of language acquisition (Fodor 1983, 1985; Levy 1994; Karmiloff-Smith 1998). However, although the fact that the adult brain is modularized is hardly disputed, there are different views of how brain regions become specialized for specific functions. A question of some interest to theorists is whether the human brain is modularized from the outset (nativist view) or whether these distinct brain regions develop as a result of biological maturation and environmental input (neuroconstructivist view). One source of insight into these issues has been the study of developmental disorders, and in particular genetic syndromes, such as Williams syndrome (WS) and Down syndrome (DS). Because of their uneven profiles characterized by dissociations of different cognitive skills, these syndromes can help us address theoretically significant questions. Investigations into the linguistic and cognitive profiles of individuals with these genetic abnormalities have been used as evidence to advance theoretical views about innate modularity and the cognitive architecture of the human mind. The present chapter will be organized as follows. To begin, two different theoretical proposals in the modularity debate will be presented. Then studies of linguistic abilities in WS and in DS will be reviewed. Here, the emphasis will be mainly on WS, because theoretical debates have focused primarily on WS, there is a larger body of literature on WS, and DS subjects have typically been used for purposes of comparison. Finally, the modularity debate will be revisited in light of the literature review of both WS and DS. Conclusions will be drawn regarding the contribution of these two genetic syndromes to the issue of cognitive modularity, and in particular innate modularity.
Abstract:
Initializing the ocean for decadal predictability studies is a challenge, as it requires reconstructing the little-observed subsurface trajectory of ocean variability. In this study we explore to what extent surface nudging using well-observed sea surface temperature (SST) can reconstruct the deeper ocean variations for the 1949–2005 period. An ensemble produced with a nudged version of the IPSLCM5A model is compared to ocean reanalyses and reconstructed datasets. The SST is restored to observations using a physically based relaxation coefficient, in contrast to earlier studies, which use a much larger value. The assessment is restricted to the regions where the ocean reanalyses agree, i.e. the upper 500 m of the ocean, although this can be latitude and basin dependent. Significant reconstruction of the subsurface is achieved in specific regions, namely the regions of subduction in the subtropical Atlantic, below the thermocline in the equatorial Pacific and, in some cases, in the North Atlantic deep convection regions. Beyond the mean correlations, ocean integrals are used to explore the time evolution of the correlation over 20-year windows. Classical fixed-depth heat content diagnostics do not exhibit any significant reconstruction between the different existing observation-based references and can therefore not be used to assess globally averaged time-varying correlations in the nudged simulations. Using the physically based average temperature above an isotherm (14 °C) alleviates this issue in the tropics and subtropics and shows significant reconstruction of these quantities in the nudged simulations for several decades. This skill is attributed to the wind stress reconstruction in the tropics, as already demonstrated in a perfect-model study using the same model. Thus, we also show here the robustness of this result in a historical and observational context.
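For concreteness, a minimal sketch of the SST restoring (nudging) term described above, written as a surface heat-flux correction proportional to the model–observation misfit; the notation, the top-layer formulation and the magnitude quoted afterwards are assumptions for illustration, not values taken from the study:

```latex
Q_{\mathrm{nudge}} = -\gamma \left(\mathrm{SST}_{\mathrm{model}} - \mathrm{SST}_{\mathrm{obs}}\right),
\qquad
\left.\frac{\partial T_{1}}{\partial t}\right|_{\mathrm{nudge}} = \frac{Q_{\mathrm{nudge}}}{\rho_{0}\, c_{p}\, h_{1}}
```

Here γ (in W m⁻² K⁻¹) is the relaxation coefficient and ρ₀ c_p h₁ is the heat capacity of the model's top layer of thickness h₁. A physically based γ of a few tens of W m⁻² K⁻¹, of the order of the air–sea heat-flux feedback, lets the modelled SST deviate from observations, whereas the much larger values used in earlier studies pin the SST almost exactly to the observed field.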
Abstract:
This chapter examines the importance of legitimacy for international organizations, and their efforts to legitimate themselves vis-à-vis different audiences. Legitimacy, which for decades barely featured in the scholarly analysis of international organizations, has since the late 1990s been an increasingly important lens through which the processes, practices, and structures of international organizations have been examined. The chapter makes three main arguments. First, it argues that in most international organizations the most important actors engaging in legitimation efforts are not the supranational bureaucracies, but member states. This has important implications for our understanding of the purposes of seeking legitimacy, and for the possible practices. Second, legitimacy and legitimation serve a range of purposes for these states, beyond achieving greater compliance with their decisions, which has been one of the key functional logics highlighted for legitimacy in the literature. Instead, legitimacy is frequently sought to exclude outsiders from the functional or territorial domains affected by an international organization’s authority, or to maintain external material and political support for existing arrangements. Third, one of the most prominent legitimation efforts, institutional reform, often prioritizes form over function, signalling to important and powerful audiences to encourage their continued material and political support. To advance these arguments, the chapter is divided into four sections. The first develops the concept of legitimacy and its application to international organizations, and then asks why their legitimacy has become such an important intellectual and political concern in recent years. The second part will look in more detail at the legitimation practices of international organizations, focusing on who engages in these practices, who the key audiences are, and how legitimation claims are advanced. The third section will look in more detail at one of the most common forms of legitimation – institutional reform – through the lens of two such reforms in international organizations: efforts towards greater interoperability in NATO, and the establishment of the African Peace and Security Architecture in the African Union (AU). The chapter will conclude with some reflections on the contribution that a legitimacy perspective has made to our understanding of the practices of international organizations.
Abstract:
Impedance spectroscopy has proven to be a powerful tool for reaching high sensitivity in sensor arrays made with nanostructured films in so-called electronic tongue systems, whose distinguishing ability may be enhanced with sensing units capable of molecular recognition. In this study we show that for optimized sensors and biosensors the dielectric relaxation processes involved in impedance measurements should also be considered, in addition to an adequate choice of sensing materials. We used sensing units made from layer-by-layer (LbL) films with alternating layers of the polyelectrolytes poly(allylamine) hydrochloride (PAH) and poly(vinyl sulfonate) (PVS), or LbL films of PAH alternated with layers of the enzyme phytase, all adsorbed on gold interdigitated electrodes. Surprisingly, the detection of phytic acid was as effective with the PVS/PAH sensing system as with the PAH/phytase system, in spite of the specific interactions of the latter. This was attributed to the dependence of the relaxation processes on nonspecific interactions, such as electrostatic cross-linking, and possibly on the distinct film architecture, as the phytase layers were found to grow as columns on the LbL film, in contrast to the molecularly thin PAH/PVS films. Using projection techniques, we were able to detect phytic acid at the micromolar level with either of the sensing units in a data analysis procedure that allows for further optimization.
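As a hedged illustration of the kind of "projection technique" mentioned above (the abstract does not name the method; principal component analysis is assumed here purely for the sketch, and the file name and data layout are hypothetical), a minimal Python example projecting the responses of the sensing units onto two components:

```python
# Illustrative sketch only: project impedance/capacitance spectra from an
# electronic-tongue array onto two principal components so that samples with
# different phytic acid concentrations can be visually separated.
import numpy as np
from sklearn.decomposition import PCA

# rows = samples (e.g. phytic acid solutions at micromolar concentrations),
# columns = readings of each sensing unit over the measured frequency range
X = np.loadtxt("impedance_spectra.csv", delimiter=",")   # hypothetical file

X = (X - X.mean(axis=0)) / X.std(axis=0)        # standardize each feature
scores = PCA(n_components=2).fit_transform(X)   # 2-D projection of the samples

print(scores[:5])   # coordinates used to inspect how well the samples separate
```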
Abstract:
An important feature of a database management system (DBMS) is its client/server architecture, where managing the memory shared between the clients and the server is always a tough issue. Similarity queries are especially sensitive to this kind of architecture, since their answer sizes vary widely. Usually, the answer to a similarity query is fully processed and sent in full to the user, who is often interested in just part of it, e.g. just a few elements closer to or farther from the query reference. Compelling the DBMS to retrieve the full answer and then ignore most of it is at the very least a waste of server processing power. Paging the answer is a technique that splits it into several pages delivered on client requests. Despite the success of paging for traditional queries, little work has been done to support it in similarity queries. In this work, we present a technique that not only provides paging for similarity range and k-nearest neighbor queries, but also supports two variations of them: the forward similarity query and the backward similarity query, which return elements either increasingly farther from or increasingly closer to the query reference. The reported experiments show that, depending on the proportion of the interesting part relative to the full answer, both techniques allow queries to be answered much faster than in the non-paged way. (C) 2010 Elsevier Inc. All rights reserved.
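A minimal sketch of the paging idea described above (not the authors' algorithm): results are ranked by distance to the query reference and handed to the client one page at a time, from either the closest end (forward) or the farthest end (backward). A real DBMS would serve each page from an index instead of materializing the whole ranking, which is precisely the waste the paper avoids.

```python
from math import dist

def paged_knn(elements, query, page_size, page, backward=False):
    """Return one page of elements ordered by distance to `query`.

    elements -- list of coordinate tuples
    page     -- 0-based page index requested by the client
    backward -- False: closest first (forward); True: farthest first (backward)
    """
    ranked = sorted(elements, key=lambda e: dist(e, query), reverse=backward)
    start = page * page_size
    return ranked[start:start + page_size]

# usage: the client asks only for the first page of three elements
data = [(1, 2), (3, 4), (0, 1), (7, 8), (2, 2), (5, 5)]
print(paged_knn(data, (0, 0), page_size=3, page=0))                  # forward
print(paged_knn(data, (0, 0), page_size=3, page=0, backward=True))   # backward
```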
Abstract:
Localization and mapping are two of the most important capabilities of autonomous mobile robots and have received considerable attention from the scientific community over the last 10 years. One of the most efficient methods to address these problems is based on the Extended Kalman Filter (EKF). The EKF simultaneously estimates a model of the environment (the map) and the position of the robot based on odometric and exteroceptive sensor information. As this algorithm demands a considerable amount of computation, it is usually executed on high-end PCs coupled to the robot. In this work we present an FPGA-based architecture for the EKF algorithm that is capable of processing two-dimensional maps containing up to 1.8k features in real time (14 Hz), a three-fold improvement over a Pentium M 1.6 GHz and a 13-fold improvement over an ARM920T 200 MHz. The proposed architecture also consumes only 1.3% of the Pentium's and 12.3% of the ARM's energy per feature.
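For reference, a generic sketch of one EKF predict/update cycle of the kind summarized above (a software illustration, not the FPGA implementation; the model functions, Jacobians and noise matrices are placeholders supplied by the caller):

```python
import numpy as np

def ekf_step(x, P, u, z, f, F, h, H, Q, R):
    """One EKF cycle: predict with odometry u, then update with measurement z.

    f, h -- motion and measurement models; F, H -- their Jacobians
    Q, R -- process and measurement noise covariances
    """
    # predict: propagate the state with the motion model and inflate covariance
    x_pred = f(x, u)
    F_k = F(x, u)
    P_pred = F_k @ P @ F_k.T + Q
    # update: correct the prediction with the exteroceptive measurement
    H_k = H(x_pred)
    S = H_k @ P_pred @ H_k.T + R              # innovation covariance
    K = P_pred @ H_k.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x_pred + K @ (z - h(x_pred))
    P_new = (np.eye(len(x)) - K @ H_k) @ P_pred
    return x_new, P_new
```

In SLAM the state vector also contains every map feature, so the covariance update grows quadratically with the number of features; that cost is what motivates offloading the filter to dedicated hardware.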
Abstract:
The TCABR data analysis and acquisition system has been upgraded to support a joint research programme using remote participation technologies. The architecture of the new system uses the Java language as its programming environment. Since application parameters and hardware in a joint experiment are complex, with a large variability of components, the requirements and specification solutions need to be flexible and modular, independent of operating system and computer architecture. To describe and organize the information on all the components and the connections among them, the systems are developed using the Extensible Markup Language (XML). The communication between clients and servers uses remote procedure calls based on XML (XML-RPC). The integration of the Java language, XML and XML-RPC makes it easy to develop a standard data and communication access layer between users and laboratories using common software libraries and a Web application. The libraries allow data retrieval using the same methods for all user laboratories in the joint collaboration, and the Web application provides simple graphical user interface (GUI) access. The TCABR tokamak team, in collaboration with the IPFN (Instituto de Plasmas e Fusao Nuclear, Instituto Superior Tecnico, Universidade Tecnica de Lisboa), is implementing these remote participation technologies. The first version was tested at the Joint Experiment on TCABR (TCABRJE), a Host Laboratory Experiment organized in cooperation with the IAEA (International Atomic Energy Agency) in the framework of the IAEA Coordinated Research Project (CRP) on "Joint Research Using Small Tokamaks". (C) 2010 Elsevier B.V. All rights reserved.
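As an illustration of what such a uniform remote-access layer looks like from the client side, a minimal XML-RPC call is sketched below in Python (the real layer is written in Java, and the server URL, method name and arguments here are hypothetical):

```python
import xmlrpc.client

# hypothetical endpoint exposed by the host laboratory
server = xmlrpc.client.ServerProxy("http://tcabr.example.org:8080/rpc")

# every participating laboratory retrieves data with the same call,
# independently of its local operating system or architecture
signal = server.getSignal("TCABR", 12345, "plasma_current")   # hypothetical method
print(len(signal))
```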
Abstract:
This work presents the electro-optical characterization of metal–organic interfaces prepared by the Ion Beam Assisted Deposition (IBAD) method. IBAD, as applied in this work, combines simultaneous metallic film deposition and bombardment with an independently controlled ion beam, allowing different penetration depths of the ions and the evaporated metallic elements into the polymer. The result is a hybrid, non-abrupt interface where polymer, metal and ions coexist. We used an organic light-emitting diode with a typical vertical architecture for the interface characterization: Glass/Indium Tin Oxide (ITO)/Poly[ethylene-dioxythiophene]:poly[styrenesulfonic acid] (PEDOT:PSS)/Emitting Polymer/Metal. The emitting polymer layer comprised Poly[(9,9-dioctyl-2,7-divinylenefluorenylene)-alt-co-{2-methoxy-5-(2-ethylhexyloxy)-1,4-phenylene}] (PFO), and the metal layer of aluminum was prepared with Ar(+) ion energies varying in the range from 0 to 1000 eV. Photoluminescence, current-voltage and electroluminescence measurements were used to study the emission and electron injection properties. Changes in these properties were related to the damage caused by the energetic ions and to the metal penetration into the polymer. Computer simulations of hybrid interface damage and metal penetration were compared with experimental data. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
This paper presents the use of a multiprocessor architecture to improve the performance of tomographic image reconstruction. Image reconstruction in computed tomography (CT) is a computationally intensive task for single-processor systems. We investigate the suitability of filtered image reconstruction on DSPs organized for parallel processing and compare it with an implementation based on the Message Passing Interface (MPI) library. The experimental results show that the speedups observed for both platforms increased with the image resolution. In addition, the ratio of execution time to communication time (Rt/Rc) as a function of sample size showed a narrower variation for the DSP platform than for the MPI platform, which indicates its better performance for parallel image reconstruction.
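A toy sketch of the parallelization strategy discussed above (not the DSP or MPI code from the paper): the filtered projections are split by angle across worker processes, each worker back-projects its share, and the partial images are summed into the final reconstruction. Image size, angle count and the synthetic sinogram are assumptions for illustration.

```python
import numpy as np
from multiprocessing import Pool

N, n_angles, n_workers = 128, 180, 4
angles = np.linspace(0.0, np.pi, n_angles, endpoint=False)
sinogram = np.random.rand(n_angles, N)        # stands in for filtered projections

def backproject(args):
    """Back-project one chunk of (filtered) projections into a partial image."""
    chunk_angles, chunk_sino = args
    ys, xs = np.mgrid[-N // 2:N // 2, -N // 2:N // 2]
    partial = np.zeros((N, N))
    for theta, proj in zip(chunk_angles, chunk_sino):
        t = xs * np.cos(theta) + ys * np.sin(theta)        # detector coordinate
        partial += np.interp(t, np.arange(-N // 2, N // 2), proj)
    return partial

if __name__ == "__main__":
    tasks = list(zip(np.array_split(angles, n_workers),
                     np.array_split(sinogram, n_workers)))
    with Pool(n_workers) as pool:
        image = sum(pool.map(backproject, tasks))          # final reconstruction
    print(image.shape)
```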
Abstract:
This thesis presents the study and development of fault-tolerant techniques for programmable architectures, the well-known SRAM-customizable Field Programmable Gate Arrays (FPGAs). FPGAs are becoming more valuable for space applications because of their high density, high performance, reduced development cost and reprogrammability. In particular, SRAM-based FPGAs are very valuable for remote missions because they can be reprogrammed by the user as many times as necessary in a very short period. SRAM-based FPGAs and microcontrollers represent a wide range of components used in space applications and are therefore the focus of this work, more specifically the Virtex® family from Xilinx and the architecture of the 8051 microcontroller from Intel. Triple Modular Redundancy (TMR) with voters is a common high-level technique to protect ASICs against single event upsets (SEUs), and it can also be applied to FPGAs. The TMR technique was first tested in the Virtex® FPGA architecture using a small design based on counters. Faults were injected in all sensitive parts of the FPGA and a detailed analysis of the effect of a fault in a TMR design synthesized on the Virtex® platform was performed. Results from fault injection and from a radiation ground test facility showed the efficiency of TMR for the case study circuit. Although TMR has shown high reliability, the technique presents some limitations, such as area overhead, three times more input and output pins and, consequently, a significant increase in power dissipation. Aiming to reduce TMR costs and improve reliability, an innovative high-level technique for designing fault-tolerant systems in SRAM-based FPGAs was developed, without modifications to the FPGA architecture. This technique combines time and hardware redundancy to reduce overhead and ensure reliability. It is based on duplication with comparison and concurrent error detection. The new technique proposed in this work was developed specifically for FPGAs to cope with transient faults in the user combinational and sequential logic, while also reducing pin count, area and power dissipation. The methodology was validated by fault injection experiments on an emulation board. The thesis presents comparative results in fault coverage, area and performance for the discussed techniques.
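To make the contrast concrete, a purely illustrative bit-level sketch of the two schemes (the thesis implements them in FPGA hardware, not software): TMR masks a single upset by majority voting, while duplication with comparison (DWC) only detects a mismatch and relies on the concurrent error detection step to decide which copy is wrong.

```python
def tmr_vote(a: int, b: int, c: int) -> int:
    """Bitwise majority vote over three redundant copies of a word."""
    return (a & b) | (a & c) | (b & c)

def dwc_compare(a: int, b: int) -> tuple:
    """Duplication with comparison: forward one copy plus an error flag."""
    return a, a != b

# a single event upset flips one bit in copy b: TMR still yields the correct
# word, while DWC only flags that the two copies disagree
good = 0b1011
print(bin(tmr_vote(good, good ^ 0b0100, good)))   # 0b1011
print(dwc_compare(good, good ^ 0b0100))           # (11, True)
```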
Abstract:
This work presents a study of the organizational dimensions most relevant to the organizational architecture of medical work cooperatives. In light of the theoretical framework of several authors specializing in organizations and cooperatives, we sought to understand, discuss and analyze the possibilities and applicability of such assumptions. A historical retrospective of cooperativism, and of medical work cooperativism in particular, was carried out to identify the foundations on which these cooperatives were built and the needs they have been meant to satisfy since their creation. Within the available literature on the subject, we sought to identify each author's view of the problem of cooperative management and possible directions for managerial practice. We found that the various authors' opinions on the topic converge, especially regarding the professionalization of management in cooperatives and the lack of a single model capable of accommodating the nuances of each organization. It was concluded that, given the complexity of cooperative enterprises and the demands of the market, cooperatives can no longer keep amateur managers on their boards. It was also found that it is not possible to propose even an outline of an organizational structure, given the complexity and the high degree of specificity of today's cooperatives.
Abstract:
Information Technology (IT) is a concept that has gained importance for organizations. It is expected that the strategic use of IT not only sustains the business operations of enterprises but, above all, leverages new competitive strategies. However, these expectations about the gains from IT have not been met, and questions arise about the return on IT investments. One of the causes is credited to the lack of alignment between business and IT strategies. The search for strategic alignment between IT and business leads to the need to measure it. This assessment can help identify whether the perceptions of business executives and IT executives about the strategic alignment of IT are similar or different. The objective of this work is to investigate the perceptions of business executives and IT executives regarding the IT strategic alignment implemented in a selected organization. A case study was conducted in a company that provides services to the financial market. As a result, this work identified that there is no statistically significant difference between the perceptions of business executives and IT executives regarding the level of IT strategic alignment maturity implemented in the organization, and it highlighted factors that promote this alignment: (a) senior management supports IT, (b) IT takes part in strategic planning, (c) IT understands the business of the company, and (d) there is a partnership between business and IT executives. Additionally, it was proposed that these similar perceptions result from the sharing of assumptions, knowledge and common expectations about IT strategic alignment between the two groups of executives interviewed, which led the company to achieve a higher level of IT strategic alignment. Each practice of strategic alignment was examined separately. Although there were no statistically significant differences between the perceptions of business executives and IT executives, the practices of Communication, Measures of Value and Competence, and Skills were rated more highly by business executives, while the practices of Governance and Partnerships were perceived more favourably by IT executives. The practice of Scope and Architecture, and IT strategic alignment overall, showed no differences in perception between the two groups of executives.
Abstract:
This work aims to analyze the process of building loyalty among corporate consumers in the mobile communications market. The commercial practices of the mobile operator Vivo will be investigated in light of the proposed theories. For this analysis, a quantitative survey will be conducted with 120 companies, divided between current and former customers, together with in-depth interviews with 8 companies, also divided between current and former customers, and in-depth interviews with the executives responsible for the architecture of the commercial strategies. Based on the results of the survey, the interviews and the analysis of the proposed theories, this study intends to point out the marketing practices that can generate loyalty among small and medium-sized business customers in the mobile telephony market.
Abstract:
The architecture market serves a specific niche of people with economic means and refined taste; in other words, it is a luxury activity facing fierce competition. This work seeks to understand how some architecture firms, even in the face of such barriers, manage to stand out from the others. The project seeks to identify the most relevant resources and capabilities within these successful architecture firms. The study is grounded in the resource-based view (RBV) and in concepts related to professional service firms. The methodology was qualitative: an exploratory study with data collected through semi-structured interviews, followed by a comparative analysis. The architecture firms were selected by a panel using success criteria drawn up from interviews with specialists in the architecture market. The results showed a relationship between the size of the selected firms (number of architects, ongoing projects and completed works) and the resources and capabilities identified (the role of leadership, the centralization of the creative process, prospecting, office management and outsourcing).