853 results for "Probabilistic decision process model"


Relevance: 100.00%

Abstract:

When people use generic masculine language instead of more gender-inclusive forms, they communicate gender stereotypes and sometimes exclusion of women from certain social roles. Past research related gender-inclusive language use to sexist beliefs and attitudes. Given that this aspect of language use may be transparent to users, it is unclear whether people explicitly act on these beliefs when using gender-exclusive language forms or whether these are more implicit, habitual patterns. In two studies with German-speaking participants, we showed that spontaneous use of gender-inclusive personal nouns is guided by explicitly favorable intentions as well as habitual processes involving past use of such language. Further indicating the joint influence of deliberate and habitual processes, Study 2 revealed that language-use intentions are embedded in explicit sexist ideologies. As anticipated in our decision-making model, the effects of sexist beliefs on language emerged through deliberate mechanisms involving attitudes and intentions.

Relevance: 100.00%

Abstract:

This study was designed to investigate and describe the relationship among resilience, forgiveness, and anger expression in adolescents. The purpose of the study was to explore whether certain adolescent resiliencies significantly related to positive or negative affective, behavioral, or cognitive levels of forgiveness and to certain types of anger expression in adolescents. This study also investigated whether there were certain adolescent resiliencies and types of forgiveness that can predict lower levels of negative anger expression in adolescents. This research was built on two conceptual models: Wolin and Wolin's (1993) Challenge Model and the Forgiveness Process Model (Enright & Human Development Study Group, 1991). It was based on a quantitative, single-subject correlational research design. A multiple regression analysis was also used to explore possible effects of resilience and forgiveness on anger expression in adolescents. In addition, two demographic variables, Age and Gender, were examined for possible effects on anger expression. Data were gathered from a convenience sample of 70 students in three Maine public high schools using three separate assessment instruments: the Adolescent Resiliency Attitudes Scale (ARAS), the Adolescent Version of the Enright Forgiveness Inventory (EFI), and the Adolescent Anger Rating Scale (AARS). Correlational analyses were done on the scales and subscales of these surveys. Significant relationships were found between several adolescent resiliencies and forms of forgiveness, as well as between some adolescent resiliencies and types of anger expression. The data indicated that Total Resiliency significantly correlated with Total Forgiveness as well as Total Anger. The findings also identified particular adolescent resiliencies that significantly predicted types of anger expression, while forgiveness did not predict types of anger expression. The data revealed that Age and Gender had no significant effect on anger expression. These findings suggest that the constructs of adolescent resilience and forgiveness have commonalities that can influence how adolescents express anger, and further suggest that intervention and prevention programs expand their focus to incorporate forgiveness skills. The findings from this study can provide critical information to counselors, therapists, and other helping professionals working with adolescents on approaches to designing and implementing therapy modalities or developmental school guidance programs for adolescents.

Relevance: 100.00%

Abstract:

In the demanding environment of healthcare reform, reduction of unwanted physician practice variation is promoted, often through evidence-based guidelines. Guidelines represent innovations that direct changes in physician practice; however, compliance has been disappointing. Numerous studies have analyzed guideline development and dissemination, while few have evaluated the consequences of guideline adoption. The primary purpose of this study was to explore and analyze the relationship between physician adoption of the glycated hemoglobin test guideline for management of adult patients with diabetes and the cost of medical care. The study also examined six personal and organizational characteristics of physicians and their association with innovativeness, or adoption of the guideline.

Cost was represented by approved charges from a managed care claims database. Total cost, and the cost of diabetes and related complications, were first compared for all patients of adopter physicians versus those of non-adopter physicians. Data were then analyzed controlling for disease severity (based on insulin dependency) and for high-cost cases. There was no statistically significant difference in any of the eight cost categories analyzed. This study covered a twelve-month period and did not reflect the cost of future complications known to result from inadequate management of glycemia. Guideline compliance did not increase annual cost, which, combined with the future benefit of glycemic control, lends support to the cost-effectiveness of the guideline in the long term. Physician adoption of the guideline was recommended to reduce the future personal and economic burden of this chronic disease.

Only half of the physicians studied had adopted the glycated hemoglobin test guideline for at least 75% of their diabetic patients. No statistically significant relationship was found between any physician characteristic and guideline adoption. Instead, it was likely that the innovation-decision process and guideline dissemination methods were most influential.

A multidisciplinary, multi-faceted approach, including interventions for each stage of the innovation-decision process, was proposed to diffuse practice guidelines more effectively. Further, it was recommended that Organized Delivery Systems expand existing administrative databases to include clinical information, decision support systems, and reminder mechanisms to promote and support physician compliance with this and other evidence-based guidelines.

Relevance: 100.00%

Abstract:

This paper considers how the multinational corporation's transfer price responds to changes in international corporate effective tax rates. It extends the decentralized decision-making analysis of transfer pricing in the context of different tax rates. It adopts and extends Bond's (1980) model of the decentralized multinational corporation that assumes centralized transfer pricing. The direction of transfer price change is as expected, while the magnitude of change is likely to be less than predicted by the Horst (1971), centralized decision-making model. The paper extends the model further by assuming negotiated transfer pricing, where the analysis is partitioned into perfect and imperfect information cases. The negotiated transfer pricing result reverts to the Horst (1971), or centralized decision-making, result, under perfect information. Under imperfect information, the centralized decision-making result obtains when top management successfully informs division general managers or it successfully implements a non-monetary reward scheme to encourage division general managers to cooperate. Under simplifying assumptions, centralized decision-making dominates decentralized decision-making, while negotiated transfer pricing weakly dominates centralized transfer pricing.

Relevance: 100.00%

Abstract:

Colorectal cancer (CRC) has become a public health concern due to the underutilization of the various screening methods. There is a need to understand a patient's decision-making process with regard to their health and obtaining the appropriate screening. Previous research has defined patient autonomy in two dimensions: the patient's involvement in the decision-making process and their desire to be informed (Ende, Kazis, Ash, & Moskowitz, 1989). Past research shows that patients have a high desire to be informed, but a low desire to be involved in the medical decision process. Deber, Kraetschmer, and Irvine (1996) developed a measure consisting of two subscales of patient involvement: the patient's desire to be involved in problem solving (PS) and in decision making (DM). Little research has examined the desire for involvement and decision making in Latino populations. The present study investigated the psychometric properties of the Deber et al. (1996) measure. In general, Latino patients in the present sample had a low desire for autonomy in health decisions or for involvement in the decision-making processes related to their health.

Relevance: 100.00%

Abstract:

The Centers for Disease Control and Prevention (CDC) estimates that more than 2 million patients annually acquire an infection while hospitalized in U.S. hospitals for other health problems, and that 88,000 die as a direct or indirect result of these infections. Infection with Clostridium difficile is the most common cause of health care-associated infectious diarrhea in industrialized countries. The purpose of this study was to explore the cost of the current treatment practice of beginning empiric metronidazole treatment for hospitalized patients with diarrhea prior to identification of an infectious agent. The records of 70 hospitalized patients were retrospectively analyzed to determine the pharmacologic treatment, laboratory testing, and radiographic studies ordered, and the median cost for each of these was determined. All patients in the study were tested for C. difficile and concurrently started on empiric metronidazole. The median direct cost for metronidazole was $7.25 per patient (95% CI 5.00, 12.72). The median direct cost for laboratory charges was $468.00 (95% CI 339.26, 552.58), and for radiology the median direct cost was $970.00 (95% CI 738.00, 3406.91). Indirect costs, which are far greater than direct costs, were not studied. At St. Luke's, if every hospitalized patient with diarrhea were empirically treated with metronidazole at a median cost of $7.25, the annual direct cost is estimated to be over $9,000 plus uncalculated indirect costs. In the U.S., the estimated annual direct cost may be as much as $21,750,000, plus indirect costs.

An unexpected and significant finding of this study was the inconsistency in the testing and treatment of patients with health care-associated diarrhea. A best-practice model for C. difficile testing and treatment was not found in the literature review. In addition to the cost savings gained by not routinely beginning empiric treatment with metronidazole, significant savings and improvement in patient care may result from a more consistent approach to the diagnosis and treatment of all patients with health care-associated diarrhea. A decision tree model for C. difficile testing and treatment is proposed, but further research is needed to evaluate the decision arms before a validated best-practice model can be proposed.
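As a rough illustration of how such a decision tree can be evaluated, expected costs can be folded up from the leaves of each arm. The $7.25 median metronidazole cost comes from this study; the `TEST_COST` and `P_POSITIVE` values, and the two-arm structure itself, are hypothetical placeholders standing in for whatever the validated decision arms turn out to be:

```python
# Expected-cost evaluation of a two-arm testing/treatment decision tree.
# The $7.25 median metronidazole cost comes from the study; the C. difficile
# positivity rate and toxin-assay cost below are HYPOTHETICAL placeholders.

DRUG_COST = 7.25       # median direct cost of empiric metronidazole (study)
TEST_COST = 25.00      # hypothetical cost of a C. difficile assay
P_POSITIVE = 0.15      # hypothetical prevalence among tested patients

def expected_cost(tree):
    """Fold a decision tree into its expected cost.

    A leaf is a plain number; a chance node is a list of
    (probability, subtree) pairs whose probabilities sum to 1.
    """
    if isinstance(tree, (int, float)):
        return tree
    return sum(p * expected_cost(sub) for p, sub in tree)

# Arm 1: treat every patient with diarrhea empirically.
empiric = DRUG_COST

# Arm 2: test first, then treat only confirmed cases.
test_first = [(P_POSITIVE, TEST_COST + DRUG_COST),
              (1 - P_POSITIVE, TEST_COST)]

print(round(expected_cost(empiric), 2))     # direct drug cost per patient, arm 1
print(round(expected_cost(test_first), 2))  # direct cost per patient, arm 2
```

The same fold extends to deeper trees (e.g., adding branches for false negatives or recurrence) once the decision arms have been validated.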

Relevance: 100.00%

Abstract:

There is growing interest in providing women with internatal care, a package of healthcare and ancillary services that can improve their health during the period after the termination of one pregnancy but before the conception of the next. Women who have had a pregnancy affected by a neural tube defect can especially benefit from internatal care because they are at increased risk for recurrence, and improvements to their health during the inter-pregnancy period can prevent future negative birth outcomes. The dissertation provides three papers that inform the content of internatal care for women at risk for recurrence by examining descriptive epidemiology to develop an accurate risk profile of the population, assessing whether women at risk for recurrence would benefit from a psychosocial intervention, and determining how to improve health promotion efforts targeting folic acid use.

Paper one identifies information relevant for developing risk profiles and conducting risk assessments. A number of investigations have found that the risk for neural tube defects differs between non-Hispanic Whites and Hispanics. To understand the risk difference, the descriptive epidemiology of spina bifida and anencephaly was examined for Hispanics and non-Hispanic Whites based on data from the Texas Birth Defects Registry for the years 1999 through 2004. Crude and adjusted birth prevalence ratios and corresponding 95% confidence intervals were calculated between descriptive epidemiologic characteristics and anencephaly and spina bifida for non-Hispanic Whites and for Hispanics. In both race/ethnic groups, anencephaly showed an inverse relationship with maternal age and a positive linear relationship with parity; both relationships were stronger in non-Hispanic Whites. Female infants had a higher risk for anencephaly among non-Hispanic Whites, and lower maternal education was associated with increased risk for spina bifida in Hispanics.

Paper two assesses the need for a psychosocial intervention. For mothers who have children with spina bifida, the transition to motherhood can be stressful. This qualitative study explored the process of becoming a mother to a child with spina bifida, focusing particularly on stress and coping in the immediate postnatal environment. Semi-structured interviews were conducted with six mothers who have children with spina bifida. Mothers were asked about their initial emotional and problem-based coping efforts, the quality and kind of support provided by health providers, and the characteristics of their meaning-based coping efforts; questions matched Transactional Model of Stress and Coping (TMSC) constructs. Analysis of the responses revealed a number of modifiable stress and coping transactions, the most salient being that health providers are in a position to address beliefs about self-causality and prevent mothers from experiencing the repercussions that stem from maintaining these beliefs.

Paper three identifies considerations for creating health promotion materials targeting folic acid use. A brochure was designed using concepts from the Precaution Adoption Process Model (PAPM). Three focus groups comprising 26 mothers of children with spina bifida evaluated the brochure. One focus group was conducted in Spanish only; the other two were conducted in a mix of English and Spanish. Qualitative analysis of coded transcripts revealed that a brochure is a helpful adjunct. Questions about folic acid support the inclusion of an insert with basic information. There may be a need to develop different educational material for Hispanics so that the importance of folic acid is presented in a situational context. Some participants blamed themselves for their pregnancy outcome, which may affect their receptivity to messages in the brochure. The women's desire for photographs that affect their perception of threat, and their identification with the second role model, indicate that they belong to PAPM Stages 2 and 3. Participants preferred colorful envelopes, high-quality paper, intimidating photographs, simple words, conversational sentences, and positive messages.

These papers develop the content of the risk assessment, psychosocial intervention, and health promotion components of internatal care as they apply to women at risk for recurrence. The findings provided evidence for considering parity and maternal age when assessing nutritional risk. The two dissimilarities between the race/ethnic groups, infant sex and maternal education, lent support to creating separate risk profiles. Interviews with mothers of children with spina bifida revealed the existence of unmet needs, suggesting that a psychosocial intervention provided as part of internatal care can strengthen and support women's well-being. Segmenting the audience according to race/ethnicity and PAPM stage can improve the relevance of print materials promoting folic acid use.
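The crude birth prevalence ratios with 95% confidence intervals used in paper one follow a standard log-transform calculation, which can be sketched as below. The case and birth counts are invented for illustration; they are not Texas Birth Defects Registry figures:

```python
import math

def prevalence_ratio(cases_exposed, total_exposed,
                     cases_unexposed, total_unexposed, z=1.96):
    """Crude birth prevalence ratio with a 95% confidence interval.

    Uses the standard log-transform method:
    SE(log PR) = sqrt(1/a - 1/N1 + 1/b - 1/N2).
    """
    pr = (cases_exposed / total_exposed) / (cases_unexposed / total_unexposed)
    se = math.sqrt(1 / cases_exposed - 1 / total_exposed
                   + 1 / cases_unexposed - 1 / total_unexposed)
    lo = math.exp(math.log(pr) - z * se)
    hi = math.exp(math.log(pr) + z * se)
    return pr, lo, hi

# Hypothetical counts: spina bifida cases and live births in two groups.
pr, lo, hi = prevalence_ratio(60, 200_000, 40, 250_000)
print(f"PR = {pr:.2f} (95% CI {lo:.2f}, {hi:.2f})")
```

Adjusted ratios would additionally control for covariates (e.g., via regression), which this minimal sketch does not attempt.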

Relevance: 100.00%

Abstract:

This is an implementation analysis of three consecutive state health policies whose goal was to improve access to maternal and child health services in Texas from 1983 to 1986. Of particular interest is the choice of the unit of analysis, the policy subsystem, and the network approach to analysis. The network approach analyzes and compares the structure and decision process of six policy subsystems in order to explain program performance. Both changes in state health policy and differences in implementation contexts explain the evolution of the program administrative and service unit, the policy subsystem. In turn, the evolution of the policy subsystem explains changes in program performance.

Relevance: 100.00%

Abstract:

This dissertation focuses on Project HOPE, an American medical aid agency, and its work in Tunisia. More specifically, this is a study of the implementation strategies of those HOPE-sponsored projects and programs designed to solve the problems of high morbidity and infant mortality rates due to environmentally related diarrheal and enteric diseases. Several environmental health programs and projects developed in cooperation with Tunisian counterparts are described and analyzed. These include (1) a paramedical manpower training program; (2) a national hospital sanitation and infection control program; (3) a community sewage disposal project; (4) a well reconstruction project; and (5) a solid-waste disposal project for a hospital.

After independence, Tunisia, like many developing countries, encountered several difficulties which hindered progress toward solving basic environmental health problems and prompted a request for aid. This study discusses the need for all who work in development programs to recognize and assess those difficulties or constraints which affect the program planning process, including latent cultural and political constraints that exist not only within the host country but within the aid agency as well. For example, failure to recognize cultural differences may adversely affect the attitudes of the host staff towards their work and towards the aid agency and its task. These factors, therefore, play a significant role in influencing program development decisions and must be taken into account in order to maximize the probability of successful outcomes.

In 1969, Project HOPE was asked by the Tunisian government to assist the Ministry of Health in solving its health manpower problems. HOPE responded with several programs, one of which concerned the training of public health nurses, sanitary technicians, and aides at Tunisia's school of public health in Nabeul. The outcome of that program, as well as the strategies used in its development, are analyzed. Certain questions are also addressed, such as: what should the indicators of success be, and when is the right time to phase out?

Another HOPE program analyzed involved hospital sanitation and infection control. Certain generic aspects of basic hospital sanitation procedures were documented and presented in the form of a process model which was later used as a "microplan" in setting up similar programs in other Tunisian hospitals. In this study, the details of the "microplan" are discussed. The development of a nation-wide program without any further need for external assistance illustrated the success of HOPE's implementation strategies.

Finally, although it is known that the high incidence of enteric disease in developing countries is due to poor environmental sanitation and poor hygiene practices, efforts by aid agencies to correct these conditions have often resulted in failure. Project HOPE's strategy was to maximize limited resources by using a systems approach to program development and by becoming actively involved in the design and implementation of environmental health projects utilizing "appropriate" technology. Three innovative projects and their implementation strategies (including technical specifications) are described.

It is advocated that if aid agencies are to make any progress in helping developing countries solve basic sanitation problems, they must take an interdisciplinary approach to program development and play an active role in helping counterparts seek and identify appropriate technologies which are socially and economically acceptable.

Relevance: 100.00%

Abstract:

Pneumonia is a well-documented and common respiratory infection in patients with acute traumatic spinal cord injuries, and may recur during the course of acute care. Using data from the North American Clinical Trials Network (NACTN) for Spinal Cord Injury, the incidence, timing, and recurrence of pneumonia were analyzed. The two main objectives were (1) to investigate the time to, and potential risk factors for, the first occurrence of pneumonia using the Cox Proportional Hazards model, and (2) to investigate pneumonia recurrence and its risk factors using a Counting Process model that generalizes the Cox Proportional Hazards model. The results of the survival analysis suggested that surgery, intubation, American Spinal Injury Association (ASIA) grade, direct admission to a NACTN site, and age (older than 65 or not) were significant risk factors for both the first event of pneumonia and multiple events of pneumonia. The significance of this research is its potential to identify, at the time of admission, patients who are at high risk for the incidence and recurrence of pneumonia. Knowledge of the occurrence and timing of pneumonia is an important factor in the development of prevention strategies, and may also provide insights into the selection of emerging therapies that compromise the immune system.
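The counting-process generalization used for recurrent pneumonia rests on expanding each patient's follow-up into (start, stop, event) intervals, one per at-risk period. A minimal sketch of that data layout, with illustrative day values rather than NACTN data:

```python
def counting_process_intervals(event_times, follow_up):
    """Expand one patient's recurrent event times into counting-process
    (start, stop, event) intervals, as used by Andersen-Gill-style
    generalizations of the Cox model.

    event_times : sorted days on which pneumonia occurred
    follow_up   : last day of observation (censoring time)
    """
    intervals, start = [], 0
    for t in event_times:
        intervals.append((start, t, 1))  # at-risk interval ending in an event
        start = t
    if start < follow_up:
        intervals.append((start, follow_up, 0))  # censored tail interval
    return intervals

# A patient with pneumonia on days 5 and 12, followed until day 30:
print(counting_process_intervals([5, 12], 30))
# A patient who never develops pneumonia over 30 days of follow-up:
print(counting_process_intervals([], 30))
```

Each interval would then carry the patient's covariates (surgery, intubation, ASIA grade, etc.), so a patient contributes one row per at-risk period instead of a single row for the first event.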

Relevance: 100.00%

Abstract:

INTRODUCTION: Objective assessment of motor skills has become an important challenge in minimally invasive surgery (MIS) training. Currently, there is no gold standard defining and determining a resident's surgical competence. To aid in the decision process, we analyze the validity of a supervised classifier, the adaptive neuro-fuzzy inference system (ANFIS), for determining the degree of MIS competence based on assessment of psychomotor skills. METHODOLOGY: The ANFIS was trained to classify performance in a box-trainer peg transfer task performed by two groups (expert/non-expert). There were 42 participants in the study: the non-expert group consisted of 16 medical students and 8 residents (< 10 MIS procedures performed), whereas the expert group consisted of 14 residents (> 10 MIS procedures performed) and 4 experienced surgeons. Instrument movements were captured by means of the Endoscopic Video Analysis (EVA) tracking system. Nine motion analysis parameters (MAPs) were analyzed, including time, path length, depth, average speed, average acceleration, economy of area, economy of volume, idle time, and motion smoothness. Data reduction was performed by means of principal component analysis, and the reduced data were then used to train the ANFIS net. Performance was measured by leave-one-out cross-validation. RESULTS: The ANFIS presented an accuracy of 80.95%, with 13 experts and 21 non-experts correctly classified. Total root mean square error was 0.88, while the area under the classifier's ROC curve (AUC) was 0.81. DISCUSSION: We have shown the usefulness of ANFIS for classification of MIS competence in a simple box-trainer exercise. The main advantage of ANFIS resides in its continuous output, which allows fine discrimination of surgical competence. There are, however, challenges that must be taken into account when considering the use of ANFIS (e.g., training time, architecture modeling). Despite this, we have shown the discriminative power of ANFIS for a low-difficulty box-trainer task, regardless of the individual significance of each MAP. Future studies are required to confirm these findings and to include new tasks, conditions, and sample populations.
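The data-reduction and validation pipeline (PCA followed by leave-one-out cross-validation) can be sketched as below. Since ANFIS itself is too involved for a short example, a nearest-class-mean classifier stands in for it, and the 42 x 9 matrix of motion analysis parameters is synthetic, not the EVA data:

```python
import numpy as np

def pca_reduce(X, k):
    """Project the rows of X onto the first k principal components."""
    Xc = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ vt[:k].T

def loocv_accuracy(X, y):
    """Leave-one-out accuracy of a nearest-class-mean classifier
    (a hypothetical stand-in for the ANFIS)."""
    hits = 0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i          # hold out sample i
        Xtr, ytr = X[mask], y[mask]
        means = {c: Xtr[ytr == c].mean(axis=0) for c in np.unique(ytr)}
        pred = min(means, key=lambda c: np.linalg.norm(X[i] - means[c]))
        hits += pred == y[i]
    return hits / len(y)

rng = np.random.default_rng(0)
# Synthetic stand-in for the 42 participants x 9 MAPs matrix:
X = np.vstack([rng.normal(0.0, 1.0, (24, 9)),   # "non-expert" scores
               rng.normal(1.5, 1.0, (18, 9))])  # "expert" scores
y = np.array([0] * 24 + [1] * 18)
acc = loocv_accuracy(pca_reduce(X, 3), y)
print(f"LOOCV accuracy: {acc:.2%}")
```

With real MAP data, the stand-in classifier would be replaced by the trained ANFIS, whose continuous output additionally supports ROC analysis.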

Relevance: 100.00%

Abstract:

Advances in hardware make it possible to collect huge volumes of data, giving rise to applications that must provide information in near-real time, e.g., patient monitoring or health monitoring of water pipes. The data streaming model emerges to support these applications, overcoming the traditional store-then-process model. With the store-then-process model, data are stored before being consulted; in streaming, data are processed on the fly, producing continuous responses without ever being stored in full. Processing data on the fly imposes three challenges: 1) responses must be produced continuously whenever new data arrive in the system; 2) data are accessed only once and are generally not retained in their entirety; and 3) the per-item processing time needed to produce a response must be low. Two models exist for computing continuous responses: the evolving model and the sliding-window model; the latter fits best with applications that must compute over only the most recently received data rather than the entire history. In recent years, research on data stream mining has focused mainly on the evolving model. In the sliding-window model, the body of work is smaller, since these algorithms must not only be incremental but must also delete the information that expires as the window slides, while still meeting the three challenges above.

Clustering is one of the fundamental tasks in data mining: given a data set, the objective is to find representative groups that provide a concise description of the data being processed. Clustering is critical in applications such as network intrusion detection or customer segmentation in marketing and advertising. Due to the huge amount of data that must be processed by such applications (up to millions of events per second), centralized solutions are often unable to cope with the processing-time restrictions and resort to shedding techniques, discarding data during load peaks. To avoid discarding data, stream processing, and in particular the clustering algorithms, must be distributed and adapted to environments in which the data themselves are distributed. In streaming, research does not only focus on designs for general tasks, such as clustering, but also on new approaches that fit particular scenarios better. As an example, an ad-hoc grouping mechanism turns out to be more adequate than the traditional k-means problem for defense against Distributed Denial of Service (DDoS) attacks.

This thesis contributes to stream clustering in both centralized and distributed environments. We present a centralized clustering algorithm that, in a broad evaluation against existing state-of-the-art solutions, discovers high-quality clusters in low time. We have also worked on a data structure that significantly reduces memory requirements while controlling, at all times, the error of the cluster statistics. In addition, we provide two protocols for distributing the clustering computation, analyzing two key features: the impact on clustering quality when the computation is distributed, and the conditions required to reduce processing time with respect to the centralized solution. Finally, we have developed a clustering-based framework for DDoS attack detection; we characterize the types of attacks detected and evaluate the efficiency and effectiveness of mitigating the attack impact.
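The sliding-window expiry semantics discussed above can be illustrated with a toy clusterer. This is not the thesis's algorithm, which maintains summary structures rather than the raw window; the sketch only shows how expired points must drop out of the cluster computation as the window slides:

```python
from collections import deque
import random

class SlidingWindowKMeans:
    """Toy sliding-window clustering: keep only the last `window` points
    and refresh k centroids with one Lloyd step per arrival (1-D points).

    Real streaming clusterers summarize the window instead of storing it;
    this sketch only illustrates the expiry semantics."""

    def __init__(self, k, window, seed=0):
        self.k = k
        self.points = deque(maxlen=window)  # old points expire automatically
        self.rng = random.Random(seed)
        self.centroids = []

    def insert(self, x):
        self.points.append(x)
        if len(self.points) < self.k:
            return
        if len(self.centroids) != self.k:
            self.centroids = self.rng.sample(list(self.points), self.k)
        # One Lloyd step over the current window contents only.
        groups = [[] for _ in range(self.k)]
        for p in self.points:
            i = min(range(self.k), key=lambda j: abs(p - self.centroids[j]))
            groups[i].append(p)
        self.centroids = [sum(g) / len(g) if g else c
                          for g, c in zip(groups, self.centroids)]

# Two interleaved clusters near 0 and 10; window of the last 6 points.
stream = [0.1, 0.2, 9.8, 0.3, 10.1, 9.9, 0.0, 10.3]
swk = SlidingWindowKMeans(k=2, window=6)
for x in stream:
    swk.insert(x)
print(sorted(round(c, 2) for c in swk.centroids))
```

After the stream is consumed, the centroids reflect only the six most recent points; the two earliest arrivals have expired and no longer influence the clusters.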

Relevance: 100.00%

Abstract:

Autonomous systems require, in most cases, reasoning and decision-making capabilities. Moreover, the decision process has to occur in real time. Real-time computing means that every situation or event has to be answered before a temporal deadline. In complex applications, these deadlines are usually on the order of milliseconds, or even microseconds if the application is very demanding. In order to comply with these timing requirements, computing tasks have to be performed as fast as possible. The problem arises when computations are no longer simple, but very time-consuming operations. A good example can be found in autonomous navigation systems with visual-tracking submodules, where Kalman filtering is the most widespread solution. However, in recent years, some interesting new approaches have been developed; particle filtering, given its more general problem-solving features, has reached an important position in the field. The aim of this thesis is to design, implement, and validate a hardware platform that constitutes, in itself, an embedded intelligent system. The proposed system combines particle filtering and evolutionary computation algorithms to generate intelligent behavior. Traditional approaches to particle filtering or evolutionary computation have been developed on software platforms, including parallel capabilities to some extent. In this work, an additional goal is to fully exploit the advantages of hardware implementation. By using the computational resources available in an FPGA device, better performance in terms of computation time is expected. These hardware resources will be in charge of the extensive repetitive computations. With this hardware-based implementation, real-time features are also expected.
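A minimal software sketch of the bootstrap particle filter that such a platform would accelerate, applied to a 1-D random-walk tracking problem; all model parameters (noise levels, particle count) are illustrative, not taken from the thesis:

```python
import math
import random

def particle_filter(observations, n=500, proc_std=0.5, obs_std=1.0, seed=1):
    """Minimal bootstrap particle filter for a 1-D random-walk state
    observed under Gaussian noise. Returns the posterior-mean estimate
    of the state after each observation."""
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 2.0) for _ in range(n)]  # diffuse prior
    estimates = []
    for z in observations:
        # Predict: propagate each particle through the motion model.
        particles = [p + rng.gauss(0.0, proc_std) for p in particles]
        # Update: weight particles by the Gaussian observation likelihood.
        weights = [math.exp(-0.5 * ((z - p) / obs_std) ** 2) for p in particles]
        total = sum(weights) or 1e-300                  # guard against underflow
        weights = [w / total for w in weights]
        estimates.append(sum(w * p for w, p in zip(weights, particles)))
        # Resample: draw a fresh, equally weighted particle set.
        particles = rng.choices(particles, weights=weights, k=n)
    return estimates

true_path = [0.5, 1.0, 1.6, 2.1, 2.4]
rng = random.Random(7)
zs = [x + rng.gauss(0.0, 0.3) for x in true_path]
print([round(e, 2) for e in particle_filter(zs)])
```

The predict, weight, and resample loops over n particles are exactly the kind of extensive, repetitive, per-particle computation that maps naturally onto parallel FPGA resources.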

Relevance: 100.00%

Abstract:

Due to the necessity of undertaking activities, every year people increase their standards of travelling (distance and time). Urban sprawl development plays an important role in these "enlargements". Thus, governments invest money in an exhaustive search for solutions to high levels of congestion and car trips. The complex relationship between the urban environment and travel behaviour has been studied in a number of cases. The objective of this paper is therefore to answer the important question of which land-use attributes influence which dimensions of travel behaviour, and to verify to what extent specific urban planning measures affect the individual decision process, through exhaustive statistical and systematic tests. This paper found that a crucial issue in the analysis of the relationship between the built environment and travel behaviour is the definition of indicators. As such, we recommend a feasible list of indicators for analyzing this relationship.

Relevance: 100.00%

Abstract:

This paper contributes a unified formulation that merges previous analyses of the prediction of the performance (value function) of a given sequence of actions (policy) when an agent operates a Markov decision process with a large state space. When the states are represented by features and the value function is linearly approximated, our analysis reveals a new relationship between two common cost functions used to obtain the optimal approximation. In addition, this analysis allows us to propose an efficient adaptive algorithm that provides an unbiased linear estimate. The performance of the proposed algorithm is illustrated by simulation, showing competitive results when compared with state-of-the-art solutions.
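As a point of reference for this setting, a standard semi-gradient TD(0) learner with a linear value approximation v(s) = w . phi(s) (not the paper's adaptive algorithm) can be sketched as follows; the two-state chain and one-hot features are assumptions for illustration:

```python
import numpy as np

def td0_linear(episodes, phi, alpha=0.05, gamma=0.9, d=2):
    """Semi-gradient TD(0) policy evaluation with a linear value
    approximation v(s) = w . phi(s), learned from transitions of the
    form (s, r, s_next, done)."""
    w = np.zeros(d)
    for episode in episodes:
        for s, r, s2, done in episode:
            target = r + (0.0 if done else gamma * (w @ phi(s2)))
            w += alpha * (target - w @ phi(s)) * phi(s)  # semi-gradient step
    return w

# Two-state chain: state 0 -> state 1 (reward 0), state 1 -> terminal (reward 1).
# With one-hot features the fixed point is v(1) = 1 and v(0) = gamma * v(1) = 0.9.
phi = lambda s: np.eye(2)[s]
episode = [(0, 0.0, 1, False), (1, 1.0, 1, True)]
w = td0_linear([episode] * 2000, phi)
print(np.round(w, 2))
```

With features that do not span the true value function, TD(0) converges to a biased fixed point, which is precisely the regime where the relationship between the two cost functions, and an unbiased linear estimator, matters.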