928 results for Shared component model


Relevance:

80.00%

Publisher:

Abstract:

The purpose of this paper is to propose a framework of ethics education that promotes the structured learning of ethics in the accounting discipline. The Ethics Education Framework (EEF) is based on three inter-related components: Rest's (1986) Four-Component Model of ethical decision-making and behaviour; the key cognitive and behavioural objectives of ethics education; and the discrete and pervasive approaches to delivering content. The EEF provides university students and professional accountants with a structure for learning to identify, analyse and resolve ethical issues, to the point of action. It is a four-stage learning continuum, represented as a set of building blocks, which introduces ethical concepts and then reinforces and develops new levels of understanding at each progressive stage. This paper describes the EEF, compares it with other ethics education models, and analyses the supportive responses of professional organisations to an Exposure Draft issued by the International Federation of Accountants (IFAC) as the initial International Education Practice Statement. IFAC has since revised its International Education Standard on ethics (IES 4), with a comment period until July 2011.

Relevance:

80.00%

Publisher:

Abstract:

This article identifies cultural models of osteoporosis, as shared by community-dwelling older women in southeastern Australia, and compares these with cultural knowledge conveyed through social marketing. Cultural models are mental constructs about specific domains in everyday life, such as health and illness, which are shared within a community. We applied domain analyses to data obtained from in-depth interviews and stakeholder-identified print materials. The response domains identified from our case studies made up the shared cultural model “Osteoporosis has low salience,” particularly when ranked against other threats to health. The cultural knowledge reflected in the print materials supported a cultural model of low salience. Cultural cues embedded in social marketing messages on osteoporosis may be internalized and motivating in unintended ways. Identifying and understanding cultural models of osteoporosis within a community may provide valuable insights to inform the development of targeted health messages.

Relevance:

80.00%

Publisher:

Abstract:

Current bio-kinematic encoders use velocity, acceleration and angular information to encode human exercises. However, in exercise physiology there is a need to distinguish between the shape of the trajectory and its execution dynamics. In this paper we propose such a two-component model and explore how best to compute these components of an action. In particular, we show how a new spatial indexing scheme, derived directly from the underlying differential geometry of curves, provides robust estimates of the shape and dynamics compared to standard temporal indexing schemes.
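The shape/dynamics split described above can be sketched as follows: a minimal NumPy example (an illustration only, not the paper's exact spatial indexing scheme) that reparameterizes a sampled trajectory by arc length to obtain its shape, and keeps the speed profile as its dynamics.

```python
import numpy as np

def shape_and_dynamics(points, times):
    """Split a sampled trajectory into shape (arc-length indexed curve)
    and dynamics (speed along that curve).

    points: (N, d) array of positions; times: (N,) array of timestamps.
    Illustrative sketch only -- not the paper's exact indexing scheme.
    """
    points = np.asarray(points, dtype=float)
    times = np.asarray(times, dtype=float)
    seg = np.linalg.norm(np.diff(points, axis=0), axis=1)  # segment lengths
    s = np.concatenate([[0.0], np.cumsum(seg)])            # cumulative arc length
    # Shape: positions resampled at uniform arc length (spatial index).
    s_uniform = np.linspace(0.0, s[-1], len(points))
    shape = np.column_stack([np.interp(s_uniform, s, points[:, k])
                             for k in range(points.shape[1])])
    # Dynamics: speed ds/dt at each original sample (temporal index).
    speed = np.gradient(s, times)
    return shape, speed
```

Because the shape component is indexed by arc length, two executions of the same movement at different speeds yield (up to sampling noise) the same shape but different speed profiles.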

Relevance:

80.00%

Publisher:

Abstract:

The main objective of this paper is to propose a novel setup that allows estimating separately the welfare costs of the uncertainty stemming from business-cycle fluctuations and from economic-growth variation, when the two types of shocks associated with them (respectively, transitory and permanent shocks) hit consumption simultaneously. Separating these welfare costs requires dealing with degenerate bivariate distributions. Lévy's Continuity Theorem and the Disintegration Theorem allow us to adequately define the one-dimensional limiting marginal distributions. Under Normality, we show that the parameters of the original marginal distributions are not affected, providing the means for calculating separately the welfare costs of business-cycle fluctuations and of economic-growth variation. Our empirical results show that, if we consider only transitory shocks, the welfare cost of business cycles is much smaller than previously thought; indeed, we found it to be negative: -0.03% of per-capita consumption. On the other hand, we found that the welfare cost of economic-growth variation is relatively large. Our estimate for reasonable preference-parameter values shows that it is 0.71% of consumption, or US$208.98 per person, per year.
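For intuition about the transitory-shock side of such calculations, the classic Lucas-style compensation under CRRA utility and i.i.d. lognormal shocks is lambda = exp(gamma * sigma^2 / 2) - 1. The sketch below uses that textbook formula with hypothetical parameter values; it is not the paper's bivariate setup.

```python
import math

def lucas_welfare_cost(gamma, sigma):
    """Lucas-style welfare cost of transitory consumption fluctuations:
    the fraction of consumption a CRRA agent (risk aversion gamma) would
    give up to remove i.i.d. lognormal shocks with std dev sigma.
    Textbook formula for illustration, not the paper's estimator."""
    return math.exp(0.5 * gamma * sigma**2) - 1.0

# Hypothetical values: gamma = 2 and a 3% std dev of transitory shocks.
cost = lucas_welfare_cost(2.0, 0.03)  # ~0.0009, i.e. about 0.09% of consumption
```

The tiny magnitude of this number is what makes the paper's negative estimate for transitory shocks plausible, while permanent growth shocks compound and so carry a much larger cost.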

Relevance:

80.00%

Publisher:

Abstract:

In this paper, we decompose the variance of logarithmic monthly earnings of prime-age males into its permanent and transitory components, using a five-wave rotating panel from the Venezuelan "Encuesta de Hogares por Muestreo" from 1995 to 1997. As far as we know, this is the first time a variance components model is estimated for a developing country. We test several specifications and find that an error components model with individual random effects and first-order serially correlated errors fits the data well. In the simplest model, around 22% of earnings variance is explained by the variance of the permanent component, 77% by purely stochastic variation and the remaining 1% by serial correlation. These results contrast with studies from industrial countries, where the permanent component is predominant. The permanent component is usually interpreted as the result of the productivity characteristics of individuals, whereas the transitory component is due to stochastic perturbations such as job and/or price instability, among others. Our findings may be due to the timing of the panel, which coincided with the macroeconomic turmoil resulting from a severe financial crisis. They suggest that earnings instability is an important source of inequality in a region characterized by high inequality and macroeconomic instability.
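A minimal simulation of the error-components idea (hypothetical parameter values, not the Venezuelan data) shows how the covariance of earnings across distant waves isolates the permanent variance once the AR(1) transitory part has died out:

```python
import numpy as np

rng = np.random.default_rng(0)
n_people, n_waves = 2000, 5
sigma_perm, sigma_trans, rho = 0.5, 1.0, 0.1  # hypothetical parameters

# Permanent component: one draw per person, constant over waves.
mu = rng.normal(0.0, sigma_perm, size=(n_people, 1))
# Transitory component: AR(1) noise per person over waves,
# initialized at its stationary variance.
eps = np.zeros((n_people, n_waves))
eps[:, 0] = rng.normal(0.0, sigma_trans / np.sqrt(1 - rho**2), n_people)
for t in range(1, n_waves):
    eps[:, t] = rho * eps[:, t - 1] + rng.normal(0.0, sigma_trans, n_people)
y = mu + eps  # log earnings = permanent + transitory

# Moment-based split: the covariance of earnings between distant waves
# is (almost) pure permanent variance, since the AR(1) part decays as rho^k.
var_total = y.var()
var_perm = np.cov(y[:, 0], y[:, 4])[0, 1]
print(f"permanent share ~ {var_perm / var_total:.2f}")
```

With these values the true permanent share is about 0.20, roughly matching the paper's 22% finding; raising sigma_perm relative to sigma_trans reproduces the industrial-country pattern instead.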

Relevance:

80.00%

Publisher:

Abstract:

This work addresses the university-firm relationship, aiming to understand the shared management model for R&D in petroleum established between Petrobras and UFRN. It is a case study investigating whether the cooperation model established by the two institutions fosters innovation, generates technical-scientific knowledge, and contributes to coordination with other actors in the promotion of technological innovation. In addition to desk research, the data needed for the analysis were obtained through questionnaires sent to the coordinators of R&D projects at the company and the university, and through interviews with subjects who have participated in the programme from its inception to the present day. The case was analysed through the lenses of the Resource-Based View and interorganizational network theory. The data show that the research projects were aligned with strategic planning; that 29% of the R&D projects achieved their proposed objectives (11% of which were incorporated into business processes); that the partnership produced technical and scientific knowledge reflected in hundreds of national and international publications, theses, dissertations, eleven patents, and both radical and incremental innovations; and that it benefited academic processes by improving UFRN's infrastructure and changing the "attitude" of the university, which now has national prominence in research and in training personnel for the oil sector. As for the model itself, from a technical point of view it is appropriate, despite some problems; from a managerial point of view it is criticized for excessive bureaucracy; and from the standpoint of the strategic allocation of resources, its legal framework needs to be reassessed, because it is focused only on higher education, whereas it should also reach secondary education given the new reality of the oil sector in Brazil.
For this, it would be desirable to bring local government into the partnership. Taken together, the findings lead to the conclusion that the model constitutes an organizational-arrangement innovation, named here the Shared Management of R&D in petroleum between Petrobras and UFRN. The shared management model proves viable as a simple and effective way to manage partnerships between firms and science and technology institutions. It emerged from contingencies arising from the regulatory environment and from resource dependence. The partnership is the result of a process of convergence, construction and evaluation, supported by the tripod of simplicity, systematization and continuity, factors that were important for its consolidation. In practice, an organizational arrangement was built to manage an innovative university-industry partnership, defined by a dyadic relationship on two levels (institutional and technical, hence a hybrid governance), by systematic quarterly meetings, and by a standardized financial contribution proportional to the advancement of the research. These features established a point of interaction between the scientific and the technological-business dimensions, demystifying the notion that they are two separate worlds.

Relevance:

80.00%

Publisher:

Abstract:

The development of robots has shown itself to be a very complex interdisciplinary research field. The predominant procedure in recent decades has been based on the assumption that each robot is a fully personalized project, with hardware and software technologies embedded directly in robot parts with no level of abstraction. Although this methodology has brought countless benefits to robotics research, it has also imposed major drawbacks: (i) the difficulty of reusing hardware and software parts in new robots or new versions; (ii) the difficulty of comparing the performance of different robot parts; and (iii) the difficulty of adapting development needs, at the hardware and software levels, to the expertise of local groups. Major advances might be achieved, for example, if the physical parts of a robot could be reused in a different robot built with other technologies by another researcher or group. This paper proposes a framework for robots, TORP (The Open Robot Project), which aims to put forward a standardization of all dimensions (electrical, mechanical and computational) of a shared robot development model. The architecture is based on the dissociation between the robot and its parts, and between the robot parts and their technologies. The paper presents the first specification for a TORP family and the first humanoid robot constructed following the TORP specification set, as well as the improvements proposed for them.
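The dissociation between a robot and its parts can be sketched as a common, technology-agnostic part interface; the API below is hypothetical and only illustrates the idea, not TORP's actual specification.

```python
from abc import ABC, abstractmethod

class RobotPart(ABC):
    """Technology-agnostic part contract: the robot composes parts
    without knowing their underlying hardware/software technology.
    Hypothetical interface for illustration only."""
    @abstractmethod
    def describe(self) -> str: ...

class ServoArm(RobotPart):           # one possible technology
    def describe(self) -> str:
        return "servo-driven arm"

class HydraulicArm(RobotPart):       # a drop-in replacement by another group
    def describe(self) -> str:
        return "hydraulic arm"

class Robot:
    def __init__(self, parts):
        self.parts = list(parts)     # parts are interchangeable by contract
    def inventory(self):
        return [p.describe() for p in self.parts]

robot = Robot([ServoArm(), HydraulicArm()])
```

Because both arms satisfy the same contract, either can be reused in a different robot built by another group, which is the reuse scenario the paper describes.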

Relevance:

80.00%

Publisher:

Abstract:

Among the traits of economic importance to dairy cattle, those related to sexual precocity and herd longevity are essential to the success of the activity, because the length of a cow's stay in a herd is determined by its productive and reproductive life. In Brazil, there are few studies on the reproductive efficiency of Brown Swiss cows, and no study was found applying survival analysis to this breed. Thus, in the first chapter of this study, the age at first calving of Brown Swiss heifers was analyzed as the time until the event, using the nonparametric Kaplan-Meier method and the gamma shared frailty model within the survival analysis framework. Survival and hazard rate curves associated with this event were estimated, and the influence of covariates on this time was identified. The mean and median times to first calving were 987.77 and 1,003 days, respectively, and the covariates significant by the log-rank test in the Kaplan-Meier analysis were birth season, calving year, sire (the cow's father) and calving season. In the frailty-model analysis, the breeding values and frailties of the sires were predicted by modeling the risk function of each cow as a function of birth season as a fixed covariate and sire as a random covariate, with the frailty following a gamma distribution. Sires with high, positive breeding values have high frailties, which means a shorter survival time of their daughters to the event, i.e., a reduction in their age at first calving. The second chapter aimed to evaluate the longevity of dairy cows using the nonparametric Kaplan-Meier method and the Cox and Weibull proportional hazards models. Ten thousand records of the longevity trait of Brown Swiss cows were simulated, involving the times until the occurrence of five consecutive calvings (the event), considered here as typical of a long-lived cow.
The covariates considered in the database were age at first calving, herd and sire (the cow's father). All covariates influenced the longevity of cows by the log-rank and Wilcoxon tests. The mean and median times to the occurrence of the event were 2,436.285 and 2,437 days, respectively. Sires with higher breeding values also have a greater risk that their daughters reach the five consecutive calvings by 84 months.
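The Kaplan-Meier estimator used in both chapters multiplies, over the observed event times, the survival fractions among animals still at risk; a minimal sketch (not the study's code):

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator of the survival curve.
    times: time to event or censoring; events: 1 = event observed
    (e.g. first calving), 0 = censored.
    Returns the event times and the survival probability after each."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    surv, out_t, out_s = 1.0, [], []
    for t in np.sort(np.unique(times)):
        d = int(((times == t) & (events == 1)).sum())  # events at time t
        n = int((times >= t).sum())                    # at risk just before t
        if d > 0:
            surv *= 1.0 - d / n                        # product-limit step
            out_t.append(t)
            out_s.append(surv)
    return np.array(out_t), np.array(out_s)
```

The median time to first calving reported in the abstract is the time at which this estimated curve crosses 0.5; censored animals contribute to the risk sets without triggering a product-limit step.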

Relevance:

80.00%

Publisher:

Abstract:

The constant increase in the complexity of computer applications demands ever more powerful hardware support. With processor operating frequencies reaching their limit, the most viable solution is the use of parallelism. The concept of MPSoCs (Multi-Processor System-on-Chip) builds on parallelism techniques and on the progressive growth in the number of transistors that can be integrated on a single chip. MPSoCs will eventually become a cheaper and faster alternative to supercomputers and clusters, and applications developed for these high-performance systems will migrate to computers equipped with MPSoCs containing dozens to hundreds of computation cores. In particular, applications in oil and natural gas exploration are also characterized by the high processing capacity they require and would benefit greatly from such systems. This work evaluates a traditional and complex application of the oil and gas industry, reservoir simulation, developing a solution for integrated computational systems on a single chip with hundreds of functional units. To this end, since the STORM (MPSoC Directory-Based Platform) platform already had a shared memory model, a new distributed memory model was developed, along with a message-passing library following the MPI standard.
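The distributed-memory idea, each computation core owning its data and exchanging it only through explicit messages, can be sketched as follows; threads with a message queue stand in for MPI-style processes here, so this is a conceptual analogue, not the STORM library itself.

```python
import queue
import threading

def worker(rank, outbox):
    # Each "node" owns a private slice of the reservoir grid (its local
    # memory) and publishes results only via explicit messages.
    cells = list(range(rank * 4, rank * 4 + 4))  # this rank's grid slice
    local_sum = sum(cells)                       # purely local computation
    outbox.put((rank, local_sum))                # send: data only leaves by message

outbox = queue.Queue()
workers = [threading.Thread(target=worker, args=(r, outbox)) for r in range(2)]
for w in workers:
    w.start()
for w in workers:
    w.join()
results = dict(outbox.get() for _ in range(2))   # receive both messages
total = sum(results.values())                    # global reduction of partial sums
```

In a real MPI program the send/receive pair and the final reduction would be library calls, but the structure, local computation followed by explicit communication, is the same.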

Relevance:

80.00%

Publisher:

Abstract:

In recent years, several middleware platforms for Wireless Sensor Networks (WSNs) have been proposed. Most of these platforms do not consider how to integrate components from generic middleware architectures. Many requirements must be considered in middleware design for WSNs, including the possibility of modifying the middleware source code without changing its external behavior. It is therefore desirable to have a generic middleware architecture able to offer an optimal configuration according to the requirements of the application. Adopting middleware based on a component model is a promising approach, because it allows better abstraction, low coupling, modularization and built-in management features. Another problem with current middleware is the treatment of interoperability between sensor networks and external networks such as the Web: most current middleware lacks functionality for accessing the data provided by the WSN via the World Wide Web, treating these data as Web resources accessible through protocols already adopted on the Web. Thus, this work presents Midgard, a component-based middleware specifically designed for WSNs that adopts the microkernel and REST architectural patterns. The microkernel pattern complements the component model, since the microkernel can be understood as a component that encapsulates the core system and is responsible for initializing core services only when needed, as well as removing them when they are no longer needed. REST, in turn, defines a standardized way for different applications to communicate based on Web standards, and enables Midgard to treat WSN data as Web resources.
The main goals of Midgard are: (i) to provide easy Web access to data generated by the WSN, exposing such data as Web resources following the principles of the Web of Things paradigm; and (ii) to provide the WSN application developer with the ability to instantiate only the specific services required by the application, thus generating a customized middleware and saving node resources. Midgard allows the WSN to be used as a set of Web resources while providing a cohesive and weakly coupled software architecture that addresses interoperability and customization. In addition, Midgard provides two services needed by most WSN applications: (i) configuration and (ii) inspection and adaptation. New services can be implemented by third parties and easily incorporated into the middleware thanks to its flexible and extensible architecture. According to our assessment, Midgard provides interoperability between the WSN and external networks, such as the Web, as well as between different applications within a single WSN. We also assessed Midgard's memory consumption, application image size, size of messages exchanged in the network, response time, overhead and scalability. During the evaluation, Midgard satisfied its goals and proved scalable without consuming resources prohibitively.
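Midgard's microkernel idea, core services instantiated lazily and removed when no longer needed, can be sketched as a small service registry; the API below is hypothetical, for illustration only.

```python
class Microkernel:
    """Sketch of the microkernel pattern (hypothetical API, not Midgard's
    actual interface): the core only registers service factories,
    instantiates a service lazily on first use, and removes it when it is
    no longer needed, saving node resources."""
    def __init__(self):
        self._factories = {}
        self._running = {}

    def register(self, name, factory):
        self._factories[name] = factory   # service known, not yet started

    def get(self, name):
        if name not in self._running:     # lazy instantiation on first use
            self._running[name] = self._factories[name]()
        return self._running[name]

    def remove(self, name):
        self._running.pop(name, None)     # free resources on constrained nodes

kernel = Microkernel()
kernel.register("inspection", lambda: {"status": "ok"})
service = kernel.get("inspection")        # instantiated on first request
```

A registered but never-requested service costs only a dictionary entry, which is the customization-and-saving argument made in the abstract.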

Relevance:

80.00%

Publisher:

Abstract:

Multimedia systems must incorporate middleware concepts in order to abstract away hardware and operating system issues. Applications in such systems may be executed on different kinds of platforms, and their components need to communicate with each other. In this context, specific communication mechanisms must be defined for the transmission of information flows. This work presents an interconnection component model for distributed multimedia environments and its implementation details. The model offers specific communication mechanisms for transmitting information flows between software components, taking into account the requirements of the Cosmos framework in order to support dynamic component reconfiguration.
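The interconnection idea can be sketched as a connector that forwards a media flow from a producer to a consumer and can be rebound at run time; the API below is hypothetical, not the Cosmos framework's actual interface.

```python
class Connector:
    """Sketch of an interconnection component (hypothetical API): it links
    a producer to a consumer sink and can be rebound at run time, which is
    the dynamic-reconfiguration requirement the model must support."""
    def __init__(self, sink):
        self._sink = sink

    def rebind(self, new_sink):
        self._sink = new_sink    # reconfigure without touching the producer

    def push(self, frame):
        self._sink(frame)        # forward one unit of the information flow

received_a, received_b = [], []
link = Connector(received_a.append)
link.push("frame-1")
link.rebind(received_b.append)   # dynamic reconfiguration mid-stream
link.push("frame-2")
```

Because the producer only sees the connector, consumers can be swapped while the flow is active, without any change to the producing component.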

Relevance:

80.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

80.00%

Publisher:

Abstract:

Introduction. The clinical, social and economic importance of treating arterial hypertension calls for tools to monitor the use of antihypertensive drugs that make it possible to verify whether the data obtained in clinical trials transfer to clinical practice. Implementing an appropriate therapeutic strategy is not simple, because the conditions under which general practitioners operate differ profoundly from those of randomized controlled trials. There is therefore a need to understand how scientific evidence is actually applied in routine clinical practice, identifying the disturbing factors that, in real-world settings, limit clinical effectiveness and appropriateness. In antihypertensive drug therapy, one such factor is reduced adherence to treatment. On this topic, recent observational studies have identified the patient characteristics associated with low compliance; others have focused on the characteristics of the physician and on his or her ability to communicate to patients the importance of drug treatment. Current evidence, however, does not clearly establish the relative weight of the two actors, patient and physician, in determining compliance levels in the treatment of chronic diseases. Objectives. The main objectives of this work are: 1) to assess how much of the total variability is attributable to the patient level and how much to the physician level; 2) to explain the total variability as a function of patient characteristics and of physician characteristics.
Materials and methods. A group of general practitioners working for the Local Health Authority of Ravenna volunteered to participate in the study. All patients with at least one blood pressure measurement between 01/01/1997 and 31/12/2002 were enrolled. Starting from the first prescription of antihypertensive drugs on or after the enrollment date, patients were followed for 365 days to measure treatment persistence. The duration of antihypertensive treatment was calculated as the days elapsed between the first and last prescription, plus a projection of the last prescription estimated from the Defined Daily Doses. Subjects with a treatment duration longer than 273 days were defined as persistent. Statistical analysis. The data have a hierarchical structure in which patients are nested within their general practitioners. In this context, individual observations are not fully independent, since patients registered with the same general practitioner tend to resemble one another because of the "common history" they share. Traditional statistical tests rest heavily on the assumption of independence between observations; when this assumption is violated, the standard errors produced by conventional tests are too small and the results appear "improperly" significant.
To handle the non-independence of observations, to evaluate simultaneously variables from different levels of the hierarchy, and to estimate the variance components for the two levels of the system, persistence on antihypertensive treatment was analyzed using multilevel generalized linear models and survival models with random effects (shared frailty models). Discussion of results. The results of this study show that 19% of those treated with antihypertensives discontinued drug therapy during the 365 days of follow-up; among newly treated patients, the discontinuation rate was 28%. The patient characteristics identified by the multilevel analysis indicate that the probability of discontinuing treatment is higher in subjects with a better overall clinical condition (younger age, absence of concomitant treatments, low diastolic blood pressure). These subjects, besides being unaccustomed to other chronic therapies, perceive the potential benefits of antihypertensive treatment less and tend to discontinue drug therapy at the first side effects. The model also showed that newly treated patients have a higher probability of discontinuation, plausibly explained by the difficulty of becoming accustomed to chronic drug intake during a settling-in phase of therapy, in which the first-choice active substances may not fully suit the patient's characteristics in terms of tolerability. The first-choice drug class also plays an essential role in determining compliance levels, probably because of the differing tolerability profiles of the many therapeutic alternatives.
Appropriate recognition of patient-level predictors of discontinuation (risk profiling) and their overall evaluation in everyday clinical practice could help improve the physician-patient relationship and increase compliance with treatment. The variance-components analysis showed that 18% of the variability in persistence on antihypertensive treatment is attributable to the general-practitioner level; after controlling for demographic and clinical differences between the patients of different physicians, the share of variability attributable to the physician level was 13%. Prescribers' empathic ability to communicate the importance of drug therapy to their patients plays an important role in determining compliance. The growing presence, in medical training, of psychology courses aimed at improving the physician-patient relationship might also explain the inverse relationship, particularly evident in the subanalysis of newly treated patients, between physician age and treatment persistence. The non-negligible share of variability explained by the grouping of patients highlights the opportunity, and the need, to invest in the training of general practitioners, with the goal of sensitizing and "educating" physicians in motivating and monitoring treated subjects, in the systematic evaluation in clinical practice of patient-level predictors of discontinuation, and in the appropriate use of the first-choice drug class.
Limitations of the study. One possible limitation of this study lies in the reduced representativeness of the sample of physicians (participation in the project was voluntary) and of patients (the enrollment requirement of at least one blood pressure measurement may have biased the sample towards patients who visit their physician more often). This could explain the lower incidence of treatment discontinuation compared with studies conducted in the same geographical area using population administrative databases. Conclusions. The analysis of general-practice databases made it possible to evaluate the use of antihypertensive drugs in clinical practice and to establish the need for greater attention in planning and achieving the goals of treatment. In light of these results, further observational studies would be very useful to support the progressive improvement of the management and treatment of patients at cardiovascular risk in general practice.
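The variance-partition result (13-18% of variability attributable to the physician level) can be illustrated with a small simulation: a one-way ANOVA variance-components estimator applied to patients clustered within physicians, using hypothetical variance values rather than the study's data or its multilevel/frailty models.

```python
import numpy as np

rng = np.random.default_rng(1)
n_doctors, n_patients = 50, 40
var_doctor, var_patient = 0.2, 1.0   # hypothetical variance components

# Simulated persistence score: physician effect + patient-level noise.
doctor_effect = rng.normal(0.0, np.sqrt(var_doctor), n_doctors)
y = doctor_effect[:, None] + rng.normal(0.0, np.sqrt(var_patient),
                                        (n_doctors, n_patients))

# One-way ANOVA variance-components estimator (balanced design).
group_means = y.mean(axis=1)
msb = n_patients * group_means.var(ddof=1)           # between-physician mean square
msw = ((y - group_means[:, None]) ** 2).sum() / (n_doctors * (n_patients - 1))
var_between = (msb - msw) / n_patients
icc = var_between / (var_between + msw)              # share at the physician level
```

With these values the true physician-level share is 0.2 / 1.2, about 17%, in the range the abstract reports; ignoring the clustering and treating all patients as independent is exactly what makes conventional standard errors too small.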