881 results for knowing-what (pattern recognition) element of knowing-how knowledge
Abstract:
A great deal of mystery surrounds the instruments of Antonio Stradivari. Many studies have been conducted, but no one fully understands how he made his instruments or why they are still considered the best in the world. This project develops an engineering model of one of Stradivari's violins that accurately simulates the structural and acoustic behavior of the instrument, and it aims to shed light on what makes Stradivari's instruments unique compared with other violins. It focuses on geometry and material properties, using several modern engineering tools, including CT scanning, experimental modal analysis, finite element analysis, correlation techniques, and acoustic synthesis.
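As a minimal sketch of the finite element modal analysis the abstract mentions, the snippet below solves the undamped generalized eigenvalue problem for natural frequencies; the small stiffness and mass matrices are hypothetical stand-ins, not data from the project's CT-derived violin model.

```python
import numpy as np
from scipy.linalg import eigh

# Hypothetical reduced stiffness (K) and mass (M) matrices; a real violin
# model would have thousands of degrees of freedom built from CT-derived
# geometry and measured material properties.
K = np.array([[ 4.0, -2.0,  0.0],
              [-2.0,  4.0, -2.0],
              [ 0.0, -2.0,  2.0]]) * 1.0e6   # N/m
M = np.diag([0.12, 0.15, 0.10])              # kg

# Undamped modal analysis: solve K v = (2*pi*f)^2 M v
eigvals, eigvecs = eigh(K, M)
frequencies_hz = np.sqrt(np.clip(eigvals, 0.0, None)) / (2.0 * np.pi)

for i, f in enumerate(frequencies_hz, start=1):
    print(f"Mode {i}: {f:8.1f} Hz")
```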
Abstract:
Yearling steers were sorted into four groups based on hip height and fat cover at the start of the finishing period. Each group of sorted steers was fed diets containing 0.59 or 0.64 Mcal NEg per pound of diet. The value of each carcass was determined by use of the Oklahoma State University Boxed Beef Calculator. Sorting to increase hip height decreased the percentage of Choice carcasses and fat cover, increased ribeye area, and had no effect on carcass weight or yield grades 1 and 2. Sorting to decrease initial fat cover decreased carcass weight, carcass fat cover, and the percentage of Choice carcasses and increased the proportion of yield grade 1 and 2 carcasses. Concentration of energy in the finishing diet had no effect on carcass measurements. Increasing the percentage of yield grade 1 and 2 carcasses did not result in increased economic value of the carcasses when quality grades were lower and when there was a wide spread between Choice and Select carcasses, as occurred in 1996. With less spread between Choice and Select, as in 1997, sorting the cattle to increase yield grades 1 and 2 resulted in increased value, especially for close-trim boxed beef. The results of this study emphasize the importance of knowing how carcasses will grade before selecting a value-based market for selling cattle.
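The interaction between the Choice-Select spread and yield-grade premiums can be illustrated with a toy calculation; the prices and premiums below are hypothetical and are not taken from the Oklahoma State University Boxed Beef Calculator used in the study.

```python
# Illustrative only: hypothetical prices showing how a wide or narrow
# Choice-Select spread interacts with a yield-grade premium.

def carcass_value(weight_lb, base_choice_price, choice_select_spread,
                  is_choice, yg12_premium, is_yg_1_or_2):
    """Return a simple per-carcass value in dollars."""
    price = base_choice_price
    if not is_choice:                 # Select carcass: discount by the spread
        price -= choice_select_spread
    if is_yg_1_or_2:                  # leaner carcass: yield-grade premium
        price += yg12_premium
    return weight_lb * price

# Wide spread (as in 1996): the quality-grade discount outweighs the premium
print(carcass_value(750, 1.05, 0.12, is_choice=True,  yg12_premium=0.02, is_yg_1_or_2=False))
print(carcass_value(750, 1.05, 0.12, is_choice=False, yg12_premium=0.02, is_yg_1_or_2=True))

# Narrow spread (as in 1997): sorting toward yield grades 1 and 2 pays off
print(carcass_value(750, 1.05, 0.03, is_choice=False, yg12_premium=0.02, is_yg_1_or_2=True))
```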
Abstract:
Climate change alone influences future levels of tropospheric ozone and its precursors through modifications of gas-phase chemistry, transport, removal, and natural emissions. The goal of this study is to determine to what extent the modes of variability of gas-phase pollutants respond to different climate change scenarios over Europe. The methodology relies on the regional modeling system MM5 (regional climate model version)-CHIMERE for a target domain covering Europe. Two full-transient simulations covering 1991–2050 under the SRES A2 and B2 scenarios, driven by the ECHO-G global circulation model, have been compared. The results indicate that the spatial patterns of variability for tropospheric ozone are similar for both scenarios, but the magnitude of the change signal differs significantly between A2 and B2. The 1991–2050 simulations share common characteristics in their chemical behavior. As observed from the NO2 and α-pinene modes of variability, our simulations suggest that the enhanced ozone chemical activity is driven by a number of parameters, such as the warming-induced increase in biogenic emissions and, to a lesser extent, the variation in nitrogen dioxide levels. For gas-phase pollutants, the general increasing trend for ozone found under A2 and B2 forcing is due to a multiplicity of climate factors, such as increased temperature, decreased wet removal associated with an overall decrease of precipitation in southern Europe, increased photolysis of primary and secondary pollutants as a consequence of lower cloudiness, and increased biogenic emissions fueled by higher temperatures.
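Modes of variability of the kind discussed here are commonly extracted with an EOF (principal component) decomposition of the simulated anomaly fields; the abstract does not state the exact procedure, so the following is only a generic sketch on synthetic data, not the study's analysis chain.

```python
import numpy as np

# Synthetic stand-in for a simulated pollutant field: time steps x grid points.
rng = np.random.default_rng(0)
n_time, n_grid = 240, 500                  # e.g. monthly means on a coarse grid
field = rng.normal(size=(n_time, n_grid))

# EOF / principal component analysis of the anomalies
anomalies = field - field.mean(axis=0)
u, s, vt = np.linalg.svd(anomalies, full_matrices=False)

explained_variance = s**2 / np.sum(s**2)
eofs = vt                                   # spatial patterns (modes of variability)
pcs = u * s                                 # their associated time series

print("Variance explained by the first three modes:", explained_variance[:3])
```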
Abstract:
Background Our knowledge of factors influencing mortality of patients with pelvic ring injuries and the impact of associated injuries is currently based on limited information. Questions/purposes We identified the (1) causes and time of death, (2) demography, and (3) pattern and severity of injuries in patients with pelvic ring fractures who did not survive. Methods We prospectively collected data on 5340 patients listed in the German Pelvic Trauma Registry between April 30, 2004 and July 29, 2011; 3034 of 5340 (57%) patients were female. Demographic data and parameters indicating the type and severity of injury were recorded for patients who died in hospital (nonsurvivors) and compared with data of patients who survived (survivors). The median follow-up was 13 days (range, 0–1117 days). Results A total of 238 (4%) patients died, a median of 2 days after trauma. The main cause of death was massive bleeding (34%), predominantly from the pelvic region (62% of all patients who died of massive bleeding). Fifty-six percent of nonsurvivors and 43% of survivors were male. Nonsurvivors were characterized by a higher incidence of complex pelvic injuries (32% versus 8%), fewer isolated pelvic ring fractures (13% versus 49%), lower initial blood hemoglobin concentration (6.7 ± 2.9 versus 9.8 ± 3.0 g/dL) and systolic arterial blood pressure (77 ± 27 versus 106 ± 24 mmHg), and a higher injury severity score (ISS) (35 ± 16 versus 15 ± 12). Conclusion Patients with pelvic fractures who did not survive were characterized by male gender, severe multiple trauma, and major hemorrhage.
Abstract:
Cultural protectionism has been an element of national and foreign policies, as an extension of state sovereignty expressed both in a defensive and offensive manner. While the generic protectionist formula in the sense of restraining trade between states through measures such as import tariffs or quotas and through privileging domestic production has somewhat disintegrated over time under the rationale for free trade and the strong practical evidence of its benefits, the particular case of cultural protectionism has persevered. As we reveal in this paper, however, it has been modified, or at least its rhetoric has changed. The enquiry into the notion of cultural protectionism or cultural diversity, as the current political jargon would have it, is but one of the paper’s objectives. Its second and certainly more ambitious goal is the search for the normative dimensions of cultural diversity policies in the global digital space, asking what adjustments are needed and how feasible the entire project of diversity regulation in this environment may be. Taking into account the specificities of cyberspace and in a forward-looking manner, we propose some adjustments to current media policy practices that could better serve the goal of a sustainably diverse cultural environment.
Abstract:
The policy development process leading to the Labour government's white paper of December 1997, The new NHS: Modern, Dependable, is the focus of this project, and the public policy development literature is used to aid understanding of this process. Policy makers who had been involved in the development of the white paper were interviewed in order to acquire a thorough understanding of who was involved in this process and how they produced the white paper. A theoretical framework is used that sorts policy development models into those that focus on knowledge and experience and those that focus on politics and influence. This framework is central to understanding the evidence gathered from the individuals and associations that participated in this policy development process. The main research question of this project is to what extent each of these sets of policy development models aids in understanding and explaining the process by which the Labour government's policies were developed. The interview evidence, along with published evidence, shows that a clear pattern of policy change emerged from this policy development process, and the Knowledge-Experience and Politics-Influence policy-making models both assist in understanding this process. The early stages of the policy development process were hierarchical and iterative, yet also very collaborative among those participating, with knowledge and experience being quite prevalent. At every point in the process, however, informal networks of political influence were used and were noted to be quite prevalent by all of the individuals interviewed. The later stages of the process then became increasingly non-inclusive, with decisions made by a select group of internal and external policy makers. These policy-making models became an important tool with which to understand the policy development process. This Knowledge-Experience and Politics-Influence dichotomy of policy development models could therefore be useful in analyzing other types of policy development.
Abstract:
Increasing attention has been given to the problem of medical errors over the past decade. Included within that focused attention has been a strong interest in reducing the occurrence of healthcare-associated infections (HAIs). Acting concurrently with federal initiatives, the majority of U.S. states have statutorily required reporting and public disclosure of HAI data. Although these state statutory enactments and other state initiatives represent a recognition of the strong concern pertaining to HAIs, vast differences in each state's HAI reporting and public disclosure requirements create a varied and unequal response to what has become a national problem. The purpose of this research was to explore the variations in state HAI legal requirements and other state mandates. State actions, including statutory enactments, regulations, and other initiatives related to state reporting and public disclosure mechanisms, were compared, discussed, and analyzed in an effort to illustrate the impact of the lack of uniformity as a public health concern. The HAI statutes, administrative requirements, and other mandates of each state and two U.S. territories were reviewed to answer the following seven research questions: How far has the state progressed in its HAI initiative? If the state has an HAI reporting requirement, is it mandatory or voluntary? What healthcare entities are subject to the reporting requirements? What data collection system is utilized? What measures are required to be reported? What is the public disclosure mechanism? How is the underlying reported information protected from public disclosure or other legal release? Secondary publicly available data, including state statutes, administrative rules, and other initiatives, were utilized to examine the current HAI-related legislative and administrative activity of the study subjects. The information was reviewed and analyzed to determine variations in HAI reporting and public disclosure laws. Particular attention was given to the seven key research questions. The research revealed that considerable progress has been achieved in state HAI initiatives since 2004. Despite this progress, however, when reviewing the state laws and HAI programs comparatively, considerable variations were found to exist with regard to the type of reporting requirements, healthcare facilities subject to the reporting laws, data collection systems utilized, reportable measures, public disclosure requirements, and confidentiality and privilege provisions. The wide variations in state statutes, administrative rules, and other agency directives create a fragmented and inconsistent approach to addressing the nationwide occurrence of HAIs in the U.S. healthcare system.
Abstract:
TNF-α is a pleiotropic cytokine involved in normal homeostasis and plays a key role in defending the host from infection and malignancy. However, when deregulated, TNF-α can lead to various disease states. Therefore, understanding the mechanisms by which TNF-α is regulated may aid in its control. In spite of the knowledge gained regarding the transcriptional regulation of TNF-α, specific TNF-α promoter elements remain to be further characterized. In particular, the TNF-α AP-1/CRE-like (TAC) element of the TNF-α promoter has been shown to be important in the regulation of TNF-α in lymphocytes. Activating transcription factor-2 (ATF-2) and c-Jun were shown to bind to and transactivate the TAC element. However, the role of TAC and the transcription factors ATF-2 and c-Jun in the regulation of TNF-α in monocytes is not as well characterized. Lipopolysaccharide (LPS), a potent activator of TNF-α in monocytes, provides a good model to study the involvement of TAC in TNF-α regulation. In contrast, all-trans retinoic acid (ATRA), a physiological monocyte-differentiation agent, is unable to induce TNF-α protein release. To delineate the functional role of TAC, we transfected the wildtype or the TAC-deleted TNF-α promoter-CAT construct into THP-1 promonocytic cells before stimulating them with LPS. CAT activity was induced 17-fold with the wildtype TNF-α promoter, whereas CAT activity was uninducible when the TAC deletion mutant was used. These data suggest that TAC is vital for LPS to activate the TNF-α promoter. Electrophoretic mobility shift assays using the TAC element as a probe showed a unique pattern for LPS-activated cells: the disappearance of the upper band of a doublet seen in untreated and ATRA-treated cells. Supershift analysis identified c-Jun and ATF-2 as components of the LPS-stimulated binding complex. Transient transfection studies using dominant negative mutants of JNK, c-Jun, or ATF-2 suggest that these proteins are important for LPS to activate the TNF-α promoter. Furthermore, increased amounts of phosphorylated (activated) c-Jun were bound to the TAC element in LPS-stimulated cells. Increased c-Jun activation was correlated with increased activity of Jun N-terminal kinase (JNK), a known upstream stimulator of c-Jun and ATF-2, in LPS-stimulated monocytes. In contrast, ATRA induced neither TNF-α protein release nor changes in c-Jun phosphorylation or JNK activity, suggesting that the pathways leading to ATRA-induced differentiation of monocytic cells are independent of TNF-α activation. Taken together, the induction of TNF-α gene expression seems to require JNK activation and the binding of activated c-Jun to the TAC element of the TNF-α promoter in THP-1 promonocytic cells.
Abstract:
The article highlights the importance that social capital, together with innovation, has acquired in the new development agenda. In a globalized context, productive integration has become a viable strategy for responding to new demands of scale and competitiveness. For this reason, various public and private organizations in Latin America see the value of generating policies and programs aimed at strengthening associativity. In this area the State assumes a fundamental role. Beyond the technical and economic results achieved, knowing how and how far integration can be achieved among agents that differ in their available resources and interests is a necessity when the aim is to foster endogenous development.
Abstract:
After the extraordinary spread of the World Wide Web during the last fifteen years, engineers and developers are now pushing the Internet to its next frontier. A new conception in computer science and network communication has been burgeoning during roughly the last decade: a world where most of the computers of the future will be extremely downsized, to the point that, in their most advanced prototypes, they will look like dust. In this vision, every single element of our "real" world carries an intelligent tag holding all of its relevant data, effectively mapping the "real" world onto a "virtual" one in which all the electronically augmented objects are present, can interact among themselves, and can influence with their behaviour that of other objects, or even the behaviour of a final human user. This is the vision of the Internet of the Future, which also draws on several novel tendencies in computer science and networking, such as pervasive computing and the Internet of Things. As has happened before, materializing a new paradigm that changes the way entities interrelate in this new environment has proved to be a goal full of challenges. Right now the situation is exciting, with a plethora of new developments, proposals, and models sprouting all the time, often in an uncoordinated, decentralised manner away from any standardization, somewhat resembling the status quo of the first developments of advanced computer networking back in the 60s and 70s. Usually, a system designed after the Internet of the Future will consist of one or several final user devices attached to their users, a network (often a Wireless Sensor Network) charged with the task of collecting data for the final user devices, and sometimes a base station sending the data for further processing to less hardware-constrained computers. When implementing a system designed with the Internet of the Future as a pattern, the issues, and more specifically the limitations, that must be faced are numerous: lack of standards for platforms and protocols, processing bottlenecks, low battery lifetime, etc. One of the main objectives of this project is to present a functional model of how a system based on the paradigms linked to the Internet of the Future works, overcoming some of the difficulties that can be expected and showing a model of a middleware architecture specifically designed for a pervasive, ubiquitous system. This Final Degree Dissertation is divided into several parts. Beginning with an introduction to the main topics and concepts of this new model, a state of the art is offered to provide a technological background. After that, an example of a semantic and service-oriented middleware is shown; later, a system built by means of this semantic and service-oriented middleware and other components is developed, justifying its placement in a particular scenario, describing it, and analysing the data obtained from it. Finally, the conclusions inferred from this system and future works that would be worth tackling are mentioned as well.
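The collection architecture described above (constrained sensor nodes reporting to a base station that forwards readings to less constrained computers) can be illustrated with a minimal sketch; the addresses, port, and message format below are hypothetical and are not taken from the dissertation's middleware.

```python
import json
import socket

BASE_STATION = ("127.0.0.1", 9999)   # hypothetical base station address

def sensor_node(node_id: str, reading: float) -> None:
    """A constrained node pushes one reading to the base station over UDP."""
    message = json.dumps({"node": node_id, "temperature_c": reading}).encode()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(message, BASE_STATION)

def base_station(max_messages: int = 2) -> None:
    """The base station gathers readings and hands them to a processing backend."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(BASE_STATION)
        for _ in range(max_messages):
            data, _addr = sock.recvfrom(1024)
            reading = json.loads(data)
            print("Forwarding to processing backend:", reading)

# In a real deployment the node and base station run on separate devices;
# for a quick local test they would be run in separate processes or threads.
```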
Abstract:
The weight of Optical Communications within Telecommunications Engineering continues to grow. Its applications, initially confined to the long-haul lines linking switching centres, now reach, as mentioned, all the way into the home. Progress in this field, advancing relentlessly, is aimed not only at increasing the transmission capacity of the systems but also at broadening the diversity of the processes performed on the signals in the optical domain. This dynamism demands that professionals in the sector review and update their knowledge so that they can comfortably resolve the questions arising in their engineering work. Moreover, in recent years the importance of Optical Communications has also been reflected in the various Telecommunications Engineering degrees, whose curricula cover this subject in both core and elective courses. The available information sources often approach the discipline with a mainly theoretical orientation. Professionals and engineering students therefore face topics that deal with complex physical phenomena, rich in abstract concepts and elaborate mathematics, but frequently lacking the practical perspective that is essential in engineering and that is, ultimately, what is required of students and engineers: knowing how to solve problems and questions related to Optical Communications. Optical communication systems, particularly those using optical fibre as the medium for transmitting information, are, as noted, undergoing significant development in the field of telecommunications. The advantages offered by fibre, well known and mentioned above (large bandwidth, total immunity to electromagnetic disturbances, absence of generated interference, low attenuation, etc.), have made it one of the fields of information and communication technology attracting the greatest interest from scientists, engineers, telecommunications operators and, of course, users. Given this reality, the aim and justification of this project is none other than to bring this technology closer to the future telecommunications engineer, or to anyone with even a passing interest in the subject, and to show, in a practical and visual way, the different phenomena that occur in the transmission of information over optical fibre, as well as the different blocks and devices into which such a communication link is divided. To achieve this objective, the final degree project presented here develops a graphical user interface (GUI) that allows the user to configure, in a simple way, each of the blocks that make up a point-to-point optical fibre link. Each block of the link offers several options which, as they are chosen and configured, change the behaviour of the system and present the user with the different phenomena found in an optical communication system, such as noise, dispersion and attenuation, for a better understanding and internalization of the theory studied. The application, implemented in MATLAB as the outcome of this final degree project, is therefore intended to serve as a practical complement, in a friendly and intuitive environment, to the courses devoted to the study of optical communications.
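The project itself is a MATLAB GUI; purely as an illustration of the kind of per-block calculation such a tool performs for a point-to-point fibre link (attenuation, power budget, chromatic dispersion), here is a short sketch with hypothetical parameter values that are not taken from the project.

```python
# Hypothetical link parameters, for illustration only.
tx_power_dbm = 0.0           # laser output power
fiber_length_km = 40.0
attenuation_db_per_km = 0.2  # typical single-mode fibre near 1550 nm
connector_loss_db = 1.0      # total loss of splices and connectors
receiver_sensitivity_dbm = -28.0

# Power budget: received power after fibre attenuation and fixed losses
rx_power_dbm = tx_power_dbm - attenuation_db_per_km * fiber_length_km - connector_loss_db
margin_db = rx_power_dbm - receiver_sensitivity_dbm
print(f"Received power: {rx_power_dbm:.1f} dBm, margin: {margin_db:.1f} dB")

# Chromatic dispersion: pulse broadening and a rough dispersion-limited bit rate
dispersion_ps_nm_km = 17.0   # typical chromatic dispersion near 1550 nm
source_linewidth_nm = 0.1
broadening_ps = dispersion_ps_nm_km * source_linewidth_nm * fiber_length_km
bit_rate_gbps = 1e3 / (4.0 * broadening_ps)   # B <= 1/(4*delta_tau), delta_tau in ps
print(f"Pulse broadening: {broadening_ps:.1f} ps, "
      f"dispersion-limited bit rate: {bit_rate_gbps:.1f} Gb/s")
```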