969 results for open systems


Relevance: 60.00%

Abstract:

We return to the description of the damped harmonic oscillator with an assessment of previous works, in particular the Bateman-Caldirola-Kanai model and a new model proposed by one of the authors. We argue that the latter has better high-energy behavior and is connected to existing open-systems approaches. (C) 2011 Elsevier B.V. All rights reserved.
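For orientation, the Caldirola-Kanai model mentioned above describes the damped oscillator through an explicitly time-dependent Hamiltonian; the standard form (stated here for context, with λ the damping rate) and the equation of motion it yields are:

```latex
H(t) = e^{-\lambda t}\,\frac{p^{2}}{2m} + e^{\lambda t}\,\frac{1}{2}\,m\,\omega^{2} x^{2},
\qquad
\ddot{x} + \lambda\,\dot{x} + \omega^{2} x = 0 .
```

Hamilton's equations give \(p = m\,e^{\lambda t}\,\dot{x}\), so the canonical momentum grows exponentially even as the physical motion decays, which is one source of the high-energy pathologies discussed in the literature.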

Relevance: 60.00%

Abstract:

Economic theory employs two methods: the hypothetical-deductive method, used mainly by neoclassical economists, and the historical-deductive method, adopted by classical and Keynesian economists. Both are legitimate but, since economics is a substantive science, not a methodological one, whose object is the economic system, the historical-deductive method is the more appropriate. The hypothetical-deductive method allows the economist to develop tools for analyzing the economic system, but fails when analyzing the system as a whole. The historical-deductive method, in contrast, starts from the empirical observation of reality and from the search for regularities and tendencies. It is an empirical method, appropriate to the substantive sciences that deal with open systems, as is the case of economics.

Relevance: 60.00%

Abstract:

The objective of this monograph is to describe an experience of governmental planning, through the use of the institution-building model, focusing on the case of the State of Paraná. The choice of theme stems largely from a widespread feeling in official technical circles that governmental planning at the state level in Brazil is unviable, a feeling motivated by the observation of an excessive centralization, by the Federal Government, of public policy formulation and of its main implementing instruments; and also from the possibility that the planning experience of the Government of Paraná in the period from 1975 to 1980 was being conducted in a way that constituted an exception to this feeling. The study used as its theoretical framework the model formulated by Milton J. Esman and Hans C. Blaise of the Inter-University Research Program in Institution Building (IRPIB), led by the University of Pittsburgh. The model is grounded in general systems theory and consequently views organizations as open systems in continuous interaction with their environment, so that the several variables explored by the model concentrate on the characterization of three basic elements: the organization, the environment, and the transactions. The monograph is divided into five chapters. The first, an introduction to the theme, points out its importance, its objectives, and the methodology adopted. The second chapter describes the theoretical model and its use as a research guide in the organizational field, and briefly reviews the most relevant literature. The third chapter presents a historical review of the use of governmental planning in Brazil, more specifically in the State of Paraná, leading up to the current organizational configuration of the state planning system. The fourth chapter concerns the application of the theoretical model, through the identification of the institutional variables (leadership, doctrine, program, resources, and internal structure), of the institutional-environmental links (enabling, normative, functional, and diffuse links), and of the transactions, which allowed considerations to be formulated about the degree of institutionalization. Finally, the fifth chapter presents the conclusions, with general comments on the organization studied and its prospects.

Relevance: 60.00%

Abstract:

This monograph is an institution-building study that shows how an organization of the Brazilian public administration, created for a specific and transitory purpose, survived, transformed itself, took root, expanded its area of influence, and became a relatively successful institution. It concerns the Comissão Executiva do Plano da Lavoura Cacaueira (CEPLAC), an autonomous agency linked to the Ministry of Agriculture, created in the 1950s by the Federal Government as an emergency solution to provide financial support to the cacao producers of southern Bahia during one of the greatest crises of the cacao economy. For his dissertation the author relied on the conceptual framework formulated by Milton J. Esman and Hans C. Blaise of the Inter-University Research Program in Institution Building. This model starts from the premise that institutions are open systems and, as such, maintain patterns of relations and exchange with the environment in which they operate: they import energy from the environment (inputs) and process it, transforming it into products (outputs) desired and valued by the environment. Within this framework there are three basic elements: the organization, the transactions, and the environment. The work is divided into five chapters. In the first, the author discusses the theme and its importance, the objectives of the study, the methodology used, and the content of the monograph. In the second, he discusses what institution building is and its evolution as a development strategy, reviews the literature, and describes the model that guided the research, recording the dysfunctions, limitations, and restrictions of the Esman-Blaise model, and/or of the effects of institutionalization, pointed out by several scholars. In the third chapter he narrates the antecedents, origin, and evolution of CEPLAC, identifying three phases of the organization; in the fourth, he examines CEPLAC through the institution-building model, analyzing the organization in terms of its institutional variables (leadership, doctrine, program, internal structure, and resources), its transactions with the environment, and its institutional-environmental variables, or simply institutional links. Finally, in the fifth chapter, he sets out the conclusions. The research sought to answer questions organized in five topics: whether there was prior, conscious, and deliberate planning to transform CEPLAC from a transitory agency with financial activity into the technical-scientific institution it is today, and, if not, what made this possible; how its leadership was structured and gathered resources, and which influences helped shape its programs; how CEPLAC's leadership remained stable through a period of change in the Brazilian political scene; and what the future of the institution might be. Among other things, the author concluded that two variables contributed decisively to the survival, autonomy, and development of the institution, and to the achievement of results valued by its clientele: (1) the existence of an institutional leadership that knew how to define CEPLAC's role and mission, maintaining its integrity over time; and (2) the guarantee of a systematic flow of financial resources through the exchange contribution levied on cacao exports. He also found that three factors were important for its stability: the institution's full alignment with the modernizing philosophy of the post-54 governments; the relevance of its institutional links; and the recognition, by the clientele and by important government sectors, of its part in the evolution of the cacao economy from a crisis situation to satisfactory results.

Relevance: 60.00%

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance: 60.00%

Abstract:

Current trends in software development push the need to face a multiplicity of diverse activities and interaction styles characterizing complex and distributed application domains, such that the resulting dynamics exhibits some degree of order, i.e. in terms of the evolution of the system and of desired equilibria. Autonomous agents and multiagent systems are argued in the literature to be one of the most immediate approaches for describing this kind of challenge. Indeed, agent research seems to converge towards the definition of renewed abstraction tools aimed at better capturing the new demands of open systems. Besides agents, which are assumed to be autonomous entities pursuing a series of design objectives, multiagent systems take into account new notions as first-class entities, aimed, above all, at modelling institutional/organizational entities, introduced for normative regulation, interaction, and teamwork management, as well as environmental entities, introduced as resources that further support and regulate agent work. The starting point of this thesis is the recognition that both organizations and environments can be rooted in a unifying perspective. Whereas recent research in agent systems comprises a set of diverse approaches, each specifically addressing at most one of the aspects mentioned above, this work proposes a unifying approach in which both agents and their organizations can be straightforwardly situated in properly designed working environments. Along this line, the work pursues the reconciliation of environments with sociality, of social interaction with environment-based interaction, and of environmental resources with organizational functionalities, with the aim of smoothly integrating the various aspects of complex and situated organizations in a coherent programming approach. Rooted in the Agents and Artifacts (A&A) meta-model, recently introduced in the context of agent-oriented software engineering and programming, the thesis promotes the notion of Embodied Organizations, characterized by computational infrastructures attaining a seamless integration of agents, organizations, and environmental entities.
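As a toy illustration (not code from the thesis), the A&A split between agents and artifacts can be sketched as follows; the class names and the `focus`/`use` operations are loosely modelled on the meta-model's vocabulary and are assumptions for this sketch:

```python
class Artifact:
    """Environment entity exposing operations and observable properties (A&A style)."""
    def __init__(self):
        self.obs_properties = {}   # observable state, readable by focusing agents
        self.observers = []        # agents currently observing this artifact

    def signal(self, prop, value):
        """Update an observable property and notify every observing agent."""
        self.obs_properties[prop] = value
        for agent in self.observers:
            agent.notify(prop, value)


class Counter(Artifact):
    """A minimal concrete artifact with one operation and one observable property."""
    def __init__(self):
        super().__init__()
        self.signal("count", 0)

    def inc(self):                 # an artifact operation, usable by agents
        self.signal("count", self.obs_properties["count"] + 1)


class Agent:
    """A minimal agent: it focuses on artifacts and acts by using their operations."""
    def __init__(self, name):
        self.name = name
        self.beliefs = {}          # percepts gathered from observed artifacts

    def focus(self, artifact):     # start observing an artifact
        artifact.observers.append(self)

    def notify(self, prop, value): # percept callback from an observed artifact
        self.beliefs[prop] = value

    def use(self, artifact, op):   # invoke an operation on an artifact
        getattr(artifact, op)()
```

The point of the sketch is the separation of concerns: agents hold goals and beliefs, while shared state and operations live in artifacts that mediate both environment-based interaction and (via shared artifacts) social interaction.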

Relevance: 60.00%

Abstract:

It is well known that many realistic mathematical models of biological systems, such as cell growth, cellular development and differentiation, gene expression, gene regulatory networks, enzyme cascades, synaptic plasticity, aging, and population growth, need to include stochasticity. These systems are not isolated, but rather subject to intrinsic and extrinsic fluctuations, which lead to a quasi-equilibrium state (homeostasis). The natural framework is provided by Markov processes, and the Master equation (ME) describes the temporal evolution of the probability of each state, specified by the number of units of each species. The ME is a relevant tool for modeling realistic biological systems and also allows exploring the behavior of open systems. These systems may exhibit not only the classical thermodynamic equilibrium states but also nonequilibrium steady states (NESS). This thesis deals with biological problems that can be treated with the Master equation, and with its thermodynamic consequences. It is organized into six chapters containing four new scientific works, grouped in two parts. (1) Biological applications of the Master equation: the stochastic properties of a toggle switch, involving a protein compound and a miRNA cluster, known to control the eukaryotic cell cycle and possibly involved in oncogenesis; and the proposal of a one-parameter family of master equations for the evolution of a population having the logistic equation as its mean-field limit. (2) Nonequilibrium thermodynamics in terms of the Master equation: the dynamical role of the chemical fluxes that characterize the NESS of a chemical network; and a one-parameter parametrization of BCM learning, originally proposed to describe plasticity processes, used to study the differences between systems in detailed balance (DB) and in NESS.
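A population master equation with a logistic mean-field limit can be illustrated with a minimal Gillespie simulation. This is a generic sketch: the birth and death rates below are one common choice whose mean-field limit is logistic, not necessarily the family studied in the thesis, and all parameter values are illustrative.

```python
import numpy as np

def gillespie_logistic(n0=50, K=100, b=2.0, d=1.0, t_end=20.0, seed=1):
    """Exact stochastic simulation of a birth-death process whose mean-field
    limit is the logistic equation dn/dt = (b - d) * n * (1 - n/K).
    Birth rate:  b * n
    Death rate:  d * n + (b - d) * n**2 / K   (one choice among many)"""
    rng = np.random.default_rng(seed)
    t, n = 0.0, n0
    times, pops = [t], [n]
    while t < t_end and n > 0:
        birth = b * n
        death = d * n + (b - d) * n * n / K
        total = birth + death
        t += rng.exponential(1.0 / total)          # waiting time to next event
        n += 1 if rng.random() < birth / total else -1
        times.append(t)
        pops.append(n)
    return np.array(times), np.array(pops)
```

Starting near n0 = K/2, the trajectory relaxes to fluctuate around the carrying capacity K, the stochastic analogue of the deterministic logistic fixed point.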

Relevance: 60.00%

Abstract:

Statistical analyses of temporal relationships between large earthquakes and volcanic eruptions suggest seismic waves may trigger eruptions even over great (>1000 km) distances, although the causative mechanism is not well constrained. In this study the relationship between large earthquakes and subtle changes in volcanic activity was investigated in order to gain greater insight into the relationship between dynamic stresses propagated by surface waves and volcanic response. Daily measurements from the Ozone Monitoring Instrument (OMI), onboard the Aura satellite, provide constraints on volcanic sulfur-dioxide (SO2) emission rates as a measure of subtle changes in activity. Time series of SO2 emission rates were produced from OMI data for thirteen persistently active volcanoes from 1 October 2004 to 30 September 2010. In order to quantify the effect of earthquakes at teleseismic distances, we modeled surface-wave amplitudes from the source mechanisms of moment magnitude (Mw) ≥7 earthquakes, and calculated the Peak Dynamic Stress (PDS). We assessed the influence of earthquakes on volcanic activity in two ways: 1) by identifying increases in the SO2 time series data and looking for causative earthquakes and 2) by examining the average emission rate before and after each earthquake. In the first, the SO2 time series for each volcano was used to calculate a baseline threshold for comparison with post-earthquake emission. Next, we generated a catalog of responses based on sustained SO2 emission increases above this baseline. Delay times between each SO2 response and each prior earthquake were analyzed using both the actual earthquake catalog and a randomly generated catalog of earthquakes. This process was repeated for each volcano. Despite varying multiple parameters, this analysis did not demonstrate a clear relationship between earthquake-generated PDS and SO2 emission.
However, the second analysis, which was based on the occurrence of large earthquakes, indicated a response at most volcanoes. Using the PDS calculations as a filtering criterion for the earthquake catalog, the SO2 mass for each volcano was analyzed in 28-day windows centered on the earthquake origin time. If the average SO2 mass after the earthquake was greater than an arbitrary percentage of the pre-earthquake mass, we identified the volcano as having a response to the event. This window analysis provided insight into what type of volcanic activity is more susceptible to triggering by dynamic stress. The volcanoes with very open systems included in this study (Ambrym, Gaua, Villarrica, Erta Ale, and Turrialba) showed a clear response to dynamic stress, while the volcanoes with more closed systems (Merapi, Semeru, Fuego, Pacaya, and Bagana) showed no response.
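The window analysis described above can be sketched as follows. This is an illustrative reimplementation, not the study's code: the function name, the symmetric 14-day half-windows, and the 1.2× threshold are assumptions standing in for the paper's "arbitrary percentage".

```python
import numpy as np

def window_response(day, so2_mass, eq_day, half_window=14, threshold=1.2):
    """28-day window centered on the earthquake: compare the mean SO2 mass in
    the `half_window` days after the event with the mean in the `half_window`
    days before it.  Returns True if the post-event mean exceeds `threshold`
    times the pre-event mean (the threshold here is an illustrative choice)."""
    day = np.asarray(day)
    so2 = np.asarray(so2_mass, dtype=float)
    pre = so2[(day >= eq_day - half_window) & (day < eq_day)]
    post = so2[(day >= eq_day) & (day < eq_day + half_window)]
    if pre.size == 0 or post.size == 0:
        return False                       # not enough coverage to decide
    return bool(post.mean() > threshold * pre.mean())
```

Applied per volcano and per PDS-filtered earthquake, a routine of this shape yields the response/no-response classification contrasted above for open versus closed systems.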

Relevance: 60.00%

Abstract:

Interannual environmental variability in Peru is dominated by the El Niño Southern Oscillation (ENSO). The most dramatic changes are associated with the warm El Niño (EN) phase (as opposed to the cold La Niña phase), which disrupts the normal coastal upwelling and affects the dynamics of many coastal marine and terrestrial resources. This study presents a trophic model for Sechura Bay, located at the northern extension of the Peruvian upwelling system, where ENSO-induced environmental variability is most extreme. Using an initial steady-state model for the year 1996, we explore the dynamics of the ecosystem through the year 2003 (including the strong EN of 1997/98 and the weaker EN of 2002/03). Based on support from the literature, we force the biomass of several non-trophically-mediated 'drivers' (e.g. scallops, benthic detritivores, octopus, and littoral fish) to observe whether the fit between historical and simulated changes (by the trophic model) is improved. The results indicate that the Sechura Bay ecosystem is a relatively inefficient system from a community energetics point of view, likely due to the periodic perturbations of ENSO. A combination of high system productivity and low-trophic-level target species of invertebrates (i.e. scallops) and fish (i.e. anchoveta) results in high catches and an efficient fishery. The importance of environmental drivers is suggested by the relatively small improvement in the fit of the simulation when trophic drivers are added to the remaining functional groups' dynamics. An additional multivariate regression model is presented for the scallop Argopecten purpuratus, which demonstrates significant effects of both spawning stock size and riverine discharge-mediated mortality on catch levels.
These results are discussed in the context of the appropriateness of trophodynamic modeling in relatively open systems, and how management strategies may be focused given the highly environmentally influenced marine resources of the region.
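The multivariate regression step mentioned for Argopecten purpuratus can be illustrated with an ordinary least-squares sketch on synthetic data. The variable names, coefficients, and data here are entirely made up; only the method (a linear model of catch against spawning stock and a discharge term) matches the text.

```python
import numpy as np

# Synthetic illustration: catch modeled as a linear function of spawning
# stock size and river discharge (true coefficients 2.0 and -3.0 by design).
rng = np.random.default_rng(0)
n = 50
spawn = rng.uniform(10, 100, n)       # hypothetical spawning stock index
discharge = rng.uniform(0, 5, n)      # hypothetical riverine discharge index
catch = 2.0 * spawn - 3.0 * discharge + rng.normal(0.0, 1.0, n)

# Design matrix with an intercept column; solve by least squares.
X = np.column_stack([np.ones(n), spawn, discharge])
coef, *_ = np.linalg.lstsq(X, catch, rcond=None)
```

With enough data the fitted `coef[1]` and `coef[2]` recover the generating coefficients, which is the kind of evidence behind the significance claim in the abstract.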

Relevance: 60.00%

Abstract:

A numerical model of sulfate reduction and isotopic fractionation has been applied to pore-fluid SO4²⁻ and δ34S data from four sites drilled during Ocean Drilling Program (ODP) Leg 168 in the Cascadia Basin at 48°N, where basement temperatures reach up to 62°C. There is a source of sulfate both at the top and the bottom of the sediment column due to the presence of basement fluid flow, which promotes bacterial sulfate reduction below the sulfate minimum zone at elevated temperatures. Pore-fluid δ34S data show the highest values (135‰) yet found in the marine environment. The bacterial sulfur isotopic fractionation factor, α, is severely underestimated if the pore fluids of anoxic marine sediments are assumed to be closed systems, and Rayleigh fractionation plots yield values of α in error by as much as 15‰ in diffusive and advective pore-fluid regimes. Model results are consistent with α = 1.077 ± 0.007, with no temperature effect over the range 1.8 to 62°C and no effect of sulfate reduction rate over the range 2 to 10 pmol/ccm/day. The reason for this large isotopic fractionation is unknown, but one difference with previous studies is the very low sulfate reduction rates recorded, about two orders of magnitude lower than literature values, which are in the range of µmol/ccm/day to tens of nmol/ccm/day. In general, the greatest ³⁴S depletions are associated with the lowest sulfate reduction rates and vice versa, and it is possible that such extreme fractionation is a characteristic of open systems with low sulfate reduction rates.
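For reference, the closed-system (Rayleigh) description against which such data are often fitted has the standard form below, with f the fraction of sulfate remaining and α the reactant/product fractionation factor (given here for orientation; the paper's point is that this form misleads when the pore fluid is open):

```latex
\frac{R}{R_{0}} = f^{\,(1/\alpha)-1},
\qquad
\delta^{34}\mathrm{S} \;\approx\; \delta^{34}\mathrm{S}_{0} \;+\; 10^{3}\left(\frac{1}{\alpha}-1\right)\ln f .
```

Since α > 1, the exponent is negative and the residual sulfate pool becomes progressively enriched in ³⁴S as f decreases; fitting this expression to pore fluids that continually exchange sulfate with basement fluids biases the inferred α, which is why the numerical transport-reaction model is required.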

Relevance: 60.00%

Abstract:

A straightforward, unprecedented sublimation protocol that reveals both conversion of a racemic compound into a racemic conglomerate and subsequent enantioenrichment has been developed for the proteinogenic amino acid valine. The phenomenon has been observed in closed and open systems, providing insight into asymmetric amplification mechanisms under presumably prebiotic conditions.

Relevance: 60.00%

Abstract:

A contribution is presented, intended to provide theoretical foundations for the ongoing efforts to employ global instability theory for the analysis of the classic boundary-layer flow, and to address the associated issue of appropriate inflow/outflow boundary conditions needed to close the PDE-based global eigenvalue problem in open flows. Starting from a theoretically clean and numerically simple application, in which results are also known analytically and thus serve as guidance for assessing the performance of the numerical methods employed herein, a sequence of issues is systematically built into the target application, until we arrive at one representative of the open systems whose instability is presently addressed by global linear theory applied to open flows, the latter application being neither tractable theoretically nor straightforward to solve by numerical means. Experience gained along the way is documented. It concerns the quantification of the departure of the numerical solution from the analytical one in the simple problem; the generation of numerical boundary layers at artificially truncated boundaries, no matter how far the latter are placed from the region of highest flow gradients; and, ultimately, the impractically large number of (direct and adjoint) modes necessary to project an arbitrary initial perturbation and follow its temporal evolution by a global analysis approach, a finding which may question the robustness, reported in the literature, of the recovery of optimal perturbations as part of global analyses yielding under-resolved eigenspectra.
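The kind of "theoretically clean" starting problem described above can be illustrated with a discrete operator whose spectrum is known in closed form. This is a generic model problem chosen for this sketch, not the one used in the paper:

```python
import numpy as np

# 1D model problem with known spectrum: u_t = u_xx on (0, 1), u(0) = u(1) = 0.
# Analytic eigenvalues: lambda_k = -(k * pi)**2.  A second-order central
# finite-difference discretization recovers them, so the discretization error
# can be quantified exactly before moving to open-flow problems where no
# analytic reference exists.
N = 200                        # interior grid points
h = 1.0 / (N + 1)
A = (np.diag(-2.0 * np.ones(N)) +
     np.diag(np.ones(N - 1), 1) +
     np.diag(np.ones(N - 1), -1)) / h**2

# All eigenvalues are real and negative; sort so the least-damped comes first.
evals = np.sort(np.linalg.eigvalsh(A))[::-1]
```

Here `evals[0]` approximates the analytic value -π² ≈ -9.8696 with an O(h²) error; refining h quantifies the discretization error in isolation, before boundary-truncation effects of the kind discussed above are introduced.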

Relevance: 60.00%

Abstract:

This thesis focuses on the development of technologies for human-robot interaction in nuclear fusion environments. The main difficulty of the nuclear fusion sector lies in the extreme environmental conditions inside the reactor, which impose very restrictive requirements on equipment that must withstand high levels of radiation, magnetism, ultra-high vacuum, temperature, and so on. Since it is not feasible for humans to carry out tasks directly, remote handling devices must be used for operation and maintenance processes. In the ITER facilities it is mandatory to have a controlled environment of extreme safety, which requires validated standards; the definition and use of protocols is essential to govern its operation. Focusing on telemanipulation with a high degree of scaling, protocols must be defined for open systems that allow interaction among equipment and devices of diverse kinds. In this context, a Teleoperation Protocol is defined that enables interconnection between master and slave devices of different typologies, allowing them to communicate bilaterally and to use different control algorithms depending on the task to be performed. This protocol and its interconnectivity have been tested on the Teleoperation Open Platform (T.O.P.), developed and integrated at the ETSII UPM as a tool to test, validate, and conduct telerobotics experiments. The Teleoperation Protocol has been proposed, through AENOR, to the ISO Telerobotics group as a valid solution to the existing problem and is currently under review. The protocol design links master and slave; however, with the radiation levels present in ITER, the controller electronics cannot enter the tokamak. It is therefore proposed that, through minimal, suitably protected electronics, the control signals travelling through the umbilical cable from the controller to the robot base be multiplexed. This theoretical exercise demonstrates the utility and feasibility of this type of solution for reducing the volume and weight of the umbilical cabling by roughly 90%, although it requires the development of specific, RadHard-certified electronics able to withstand ITER's enormous radiation levels. For a generic manipulator, and with the help of the Teleoperation Open Platform, an algorithm has been developed that, using a force/torque sensor and an IMU mounted on the robot wrist and suitably protected against radiation, computes the forces and inertias produced by the load. This is necessary in order to transmit scaled forces to the operator, so that the operator feels the load being manipulated and not other forces acting on the remote slave, as happens with other force-estimation techniques. Since the shielding of the sensors must not be large or heavy, this technology should be reserved for maintenance tasks during ITER's programmed shutdowns, when radiation levels are at their minimum. In addition, so that the operator feels the load force as faithfully as possible, electronics have been developed that, through current control of the motors, allow force control based on a characterization of the master's motors. To further increase the operator's perception, experiments were conducted showing that applying multimodal stimuli (visual, auditory, and haptic) increases immersion and task performance, since such stimuli directly influence the operator's response capability. Finally, regarding the operator's visual feedback, ITER works with cameras placed at strategic locations, whereas a human manipulating objects uses binocular vision, constantly changing the viewpoint to suit the visual needs of each moment of the task. A three-dimensional reconstruction of the task space has therefore been produced from an RGB-D camera-sensor, providing a movable virtual binocular viewpoint from a camera at a fixed location, which can be projected on a 3D display so that the operator can vary the stereoscopic viewpoint according to his or her preferences. The successful integration of these technologies for human-robot interaction in the platform has been validated through tests and experiments, verifying their usefulness for the practical application of highly scaled telemanipulation in nuclear fusion environments.
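The wrist-sensor load-compensation idea (force/torque sensor plus IMU) can be sketched with a point-mass model. This is an illustrative simplification, not the thesis's algorithm: the function names, the point-mass assumption, and all numbers are made up for the sketch.

```python
import numpy as np

def load_force(f_sensor, specific_force, m_tool):
    """Remove the tool's own contribution from the wrist sensor reading.
    f_sensor       : 3-vector measured by the force/torque sensor [N]
    specific_force : 3-vector measured by the IMU accelerometer [m/s^2]
                     (gravity plus motion-induced acceleration)
    m_tool         : known mass of the tooling between sensor and load [kg]
    For a point-mass model, everything carried below the sensor contributes
    m * specific_force, so subtracting the tool term leaves the load term."""
    f_sensor = np.asarray(f_sensor, dtype=float)
    sf = np.asarray(specific_force, dtype=float)
    return f_sensor - m_tool * sf

def estimate_load_mass(f_load, specific_force):
    """Load mass from the compensated force (valid when |specific_force| > 0)."""
    return np.linalg.norm(f_load) / np.linalg.norm(np.asarray(specific_force, float))
```

The compensated force is what would be scaled and reflected to the master, so the operator feels the manipulated load rather than the tool's weight and inertia.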

Relevance: 60.00%

Abstract:

The Industrialized House embodies the ideal of producing the single-family house through the power and procedures of industry. As such, the house becomes one more industrial product, subject to the logic of reproduction and consumption. As a consumer product, the house must establish itself as an object of desire, accessible to the group of user-consumers at which it is aimed. The dream of the Industrialized House originates in the first Industrial Revolution and is consolidated in the second, after the production of the Ford Model T and the adherence of the fathers of the modern movement. Throughout its history there have been cases of success and failure: the former generally products of conventional image, the latter most often led by architects. The architect-led dream of the Industrialized House is beginning to become a reality in Japan, Sweden, and the United States through brands such as MUJI, Arkitekthus, and Living Homes, but it is still far from widespread in our society. For this ideal to be fulfilled, it must offer values that allow society to make it its own. The thesis analyzes the history and methodology of the Industrialized House, from design to marketing, in order to offer those values in the form of proposals for the Industrialized House in this millennium. The house as industrial product and consumer product goes beyond the traditional logic of architecture to operate within the context of industrial production and the reproduction of objects. In this sense it is necessary to establish not only the form and construction of the house but also the mechanisms of reproduction, with their relevant efficiencies. The Industrialized House is not built, it is assembled, and to that end it uses the strategies of dry construction, prefabrication, the use of components, and lightweight materials. From the logic of consumption, the house must address a particular audience; it is no longer the house for everyone characteristic of situations of crisis and emergency. The house faces a market segmented in culture, in desires, and in purchasing power. As for design, it must be approached as product design rather than architectural design. The Industrialized House is not the fruit of a commission and a singular action; it must be offered ready to purchase and ready to be reproduced. This reproduction can take the form of closed models or of open systems that allow customization by the users. From the cultural sphere it is necessary to understand that the house is more than a machine for living in: it is a receptacle of emotions, part of our memory and our culture. The house as a social product is an image of ourselves; it defines the way we place ourselves in the world and therefore constitutes a definition of status. Here the thesis draws on the texts of Baudrillard, his analysis of consumer society, and the role of objects and their value as signs. The thesis reviews industrial procedures, with particular emphasis on automobile production, and situates the evolution of the Industrialized House in relation to advances in industrial production systems and to transfers from the automobile and aeronautical industries. The thesis is completed with a series of case studies that begin with the first mail-order houses of the early twentieth century, pass through the proposals of Gropius, Fuller, the Case Study House Program, Prouvé, and Sota, and end with the current situation. The Industrialized House has maintained a set of values throughout its history; as an ideal, it forms a stable body of proposals that has not changed over time. For this new millennium the ideal need not be changed but simply updated and adapted to today's production methods and to the needs, dreams, and demands of contemporary society.
Regarding this new millennium this ideal should not be changed but simply be updated and adapted to production methods and needs, dreams and demands of today's society.

Relevância:

60.00% 60.00%

Publicador:

Resumo:

It was hypothesized that employees' perceptions of an organizational culture strong in human relations values and open systems values would be associated with heightened levels of readiness for change which, in turn, would be predictive of change implementation success. Similarly, it was predicted that reshaping capabilities would lead to change implementation success, via their effects on employees' perceptions of readiness for change. These propositions were tested, using a temporal research design, with 67 employees working in a state government department who were about to undergo the implementation of a new end-user computing system in their workplace. Change implementation success was operationalized as user satisfaction and system usage. There was evidence to suggest that employees who perceived strong human relations values in their division at Time 1 reported higher levels of readiness for change at pre-implementation which, in turn, predicted system usage at Time 2. In addition, readiness for change mediated the relationship between reshaping capabilities and system usage. Analyses also revealed that pre-implementation levels of readiness for change exerted a positive main effect on employees' satisfaction with the system's accuracy, user friendliness, and formatting functions at post-implementation. These findings are discussed in terms of their theoretical contribution to the readiness for change literature, and in relation to the practical importance of developing positive change attitudes among employees if change initiatives are to be successful.