925 results for Weighted histogram analysis method


Relevance: 100.00%

Abstract:

Goals of work: The diagnosis and treatment of a brain tumour may result in long-term changes in a patient's functional and social abilities and/or in a greatly reduced life span. A qualitative investigation was conducted to examine the supportive care needs of patients with brain tumour and their carers. Materials and methods: Overall, 18 patients and 18 carers participated in focus groups or telephone interviews, following a structured interview guide to elicit supportive care services of importance to these patients and carers. Main results: Six major themes were identified using the framework analysis method, including needs for information and coping with uncertainty, practical support, support to return to pretreatment responsibilities or prepare for long-term care, support to deal with social isolation and organize respite care, support to overcome stigma/discrimination and support to discuss potentially reduced life expectancy. Conclusions: Five recommendations to improve service delivery include: assignment of a dedicated member of the care team or case manager; proactive dissemination of information, education and psychosocial support; access to objective assessment of neuropsychological functioning; facilitating easier access to welfare payments; and services facilitating communication about difficult illness-related topics. Provision of services in line with these recommendations could improve supportive care of brain tumour patients and their carers.

Relevance: 100.00%

Abstract:

According to data from INEP (2007), Brazil has about 8,866 lato sensu (specialization) courses and, consequently, the number of teachers working in higher education keeps growing, as seen in the data of the 2010 Higher Education Census, which registered 345,335 active teaching positions at higher education institutions. Based on these data, the following questions arise: what are these teachers' knowledge and pedagogical practices when teaching in lato sensu courses? How do the teachers working in this segment constitute themselves as teachers, given that their training is not always pedagogical? Training for this type of teaching is provided for in Article 66 of the Lei de Diretrizes e Bases da Educação Nacional, which states that preparation for teaching in higher education shall take place at the graduate level, primarily in master's and/or doctoral programs; there is no specific requirement of pedagogical training for teaching. The questions stated above were therefore investigated through research on the formative and professional trajectories of teachers working in lato sensu courses, in order to identify whether the lack of pedagogical training influences or has influenced their teaching knowledge and practices, how they built their identity as teachers over their careers, and what their real motivation for becoming teachers was. The research subjects are four teachers working at a private institution in the city of São Paulo. Semi-structured interviews were conducted for data collection, and categories of analysis were organized using the content analysis methodology. The research also included a historical contextualization of lato sensu legislation and a theoretical discussion of teaching knowledge and practices directed at higher education. Data analysis confirmed the hypothesis that teachers working in lato sensu courses draw their knowledge and practices from experiences lived when they were students, taking as models the references they had along their formative and professional trajectories. It was found that, in most cases, teachers only seek pedagogical training after they are already teaching in lato sensu courses, depending almost exclusively on their own initiative, since the legislation does not predetermine the required profile of a lato sensu teacher. They constitute themselves as teachers gradually, and pedagogical training is one more possibility for their practice, since professional experience in their original field of training comes first.

Relevance: 100.00%

Abstract:

Most previous studies of intellectual capital disclosures have been conducted in the context of developed countries. There is very limited empirical evidence in this area from the context of emerging economies in general and Africa in particular. This paper is one of the early attempts in this regard. The main purpose of this study is to examine the extent and nature of intellectual capital disclosures in the ‘Top 20’ South African companies over a five-year period (2002–2006). The study uses the content analysis method to scrutinise the patterns of intellectual capital disclosures during the study period. The results show that intellectual capital disclosures in South Africa increased over the five-year study period, with certain firms reporting considerably more than others. Of the three broad categories of intellectual capital disclosures, human capital appears to be the most popular category. This finding stands in sharp contrast to previous studies in this area, where external capital was found to be the most popular category.

Relevance: 100.00%

Abstract:

Purpose: Most published surface wettability data are based on hydrated materials and are dominated by the air-water interface. Water-soluble species with hydrophobic domains (such as surfactants) interact directly with the hydrophobic domains in the lens polymer. Characterisation of the relative polar and non-polar fractions of the dehydrated material provides an additional approach to surface analysis. Method: Probe liquids (water and diiodomethane) were used to characterise the polar and dispersive components of the surface energies of dehydrated lenses using the method of Owens and Wendt. A range of conventional and silicone hydrogel soft lenses was studied. The polar fraction (i.e. polar/total) of surface energy was used as a basis for the study of the structural effects that influence surfactant persistence on the lens surface. Results: When plotted against the water content of the hydrated lens, polar fraction of surface energy (PFSE) values of the dehydrated lenses fell into two rectilinear bands. One of these bands covered PFSE values ranging from 0.4 to 0.8 and contained only conventional hydrogels, with two notable additions: the plasma-coated silicone hydrogels lotrafilcon A and B. The second band covered PFSE values ranging from 0.04 to 0.28 and contained only silicone hydrogels. Significantly, the silicone hydrogel lenses with the lowest PFSE values (p<0.15) were found to be prone to lipid deposition during wear. Additionally, more hydrophobic surfactants were found to be more persistent on lenses with lower PFSE values. Conclusions: Measurement of the polar fraction of surface energy provides an important mechanistic insight into surface interactions of silicone hydrogels.
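
The Owens and Wendt calculation referred to above resolves the contact angles of the two probe liquids into polar and dispersive surface energy components of the solid. The sketch below illustrates the arithmetic only; the probe-liquid surface tension components are commonly quoted literature values and the contact angles are hypothetical, so it should be read as an illustration rather than the authors' actual processing.

    import numpy as np

    # Probe-liquid surface tension components in mN/m (commonly quoted literature values)
    LIQUIDS = {
        #                total  dispersive  polar
        "water":         (72.8, 21.8, 51.0),
        "diiodomethane": (50.8, 50.8, 0.0),
    }

    def owens_wendt(theta_deg):
        """Solve for the solid's dispersive and polar surface energy components
        from the contact angles (in degrees) of the two probe liquids."""
        A, b = [], []
        for name, (gamma, g_d, g_p) in LIQUIDS.items():
            theta = np.radians(theta_deg[name])
            # gamma_L (1 + cos theta) = 2 sqrt(g_S^d g_L^d) + 2 sqrt(g_S^p g_L^p)
            A.append([2 * np.sqrt(g_d), 2 * np.sqrt(g_p)])
            b.append(gamma * (1 + np.cos(theta)))
        x, y = np.linalg.solve(np.array(A), np.array(b))   # x = sqrt(g_S^d), y = sqrt(g_S^p)
        g_s_d, g_s_p = x**2, y**2
        return g_s_d, g_s_p, g_s_p / (g_s_d + g_s_p)       # last value is the polar fraction (PFSE)

    # Hypothetical contact angles for a dehydrated lens material
    print(owens_wendt({"water": 95.0, "diiodomethane": 45.0}))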

Relevance: 100.00%

Abstract:

The major challenge of MEG, the inverse problem, is to estimate the very weak primary neuronal currents from the measurements of extracranial magnetic fields. The non-uniqueness of this inverse solution is compounded by the fact that MEG signals contain large environmental and physiological noise that further complicates the problem. In this paper, we evaluate the effectiveness of magnetic noise cancellation by synthetic gradiometers and the beamformer analysis method of synthetic aperture magnetometry (SAM) for source localisation in the presence of large stimulus-generated noise. We demonstrate that activation of primary somatosensory cortex can be accurately identified using SAM despite the presence of significant stimulus-related magnetic interference. This interference was generated by a contact heat evoked potential stimulator (CHEPS), recently developed for thermal pain research, but which to date has not been used in a MEG environment. We also show that in a reduced shielding environment the use of higher order synthetic gradiometry is sufficient to obtain signal-to-noise ratios (SNRs) that allow for accurate localisation of cortical sensory function.
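
SAM belongs to the family of linearly constrained minimum-variance (LCMV) beamformers, which weight the sensor array so that signal from a target location is passed while uncorrelated interference is suppressed. The sketch below shows the core weight calculation on synthetic data; the channel count, regularisation factor and random lead field are illustrative assumptions, not details of the study.

    import numpy as np

    rng = np.random.default_rng(0)
    n_channels, n_samples = 64, 2000
    data = rng.standard_normal((n_channels, n_samples))   # stand-in for (noise-cancelled) MEG data
    leadfield = rng.standard_normal(n_channels)           # forward field of one source orientation

    # Regularised sensor covariance
    C = np.cov(data)
    C += 0.01 * np.trace(C) / n_channels * np.eye(n_channels)
    C_inv = np.linalg.inv(C)

    # LCMV / SAM-style weights: w = C^-1 l / (l^T C^-1 l)
    w = C_inv @ leadfield / (leadfield @ C_inv @ leadfield)

    source_ts = w @ data                      # virtual-sensor time course at this location
    pseudo_z = np.var(source_ts) / (w @ w)    # simple noise-normalised source power
    print(source_ts.shape, round(float(pseudo_z), 3))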

Relevance: 100.00%

Abstract:

Existing assignment problems for assigning n jobs to n individuals are limited to considerations of cost or profit measured as crisp values. However, in many real applications, costs are not deterministic numbers. This paper develops a procedure based on the Data Envelopment Analysis method to solve assignment problems with fuzzy costs or fuzzy profits for each possible assignment. It aims to obtain the points with maximum membership values for the fuzzy parameters while maximizing the profit or minimizing the assignment cost. In this method, a discrete approach is first presented to rank the fuzzy numbers. Then, corresponding to each fuzzy number, we introduce a crisp number using the efficiency concept. A numerical example is used to illustrate the usefulness of this new method. © 2012 Operational Research Society Ltd. All rights reserved.
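
To make the underlying assignment structure concrete, the sketch below reduces a small triangular fuzzy cost matrix to crisp values and solves the resulting crisp problem with the Hungarian algorithm. The centroid defuzzification is only a stand-in for the paper's DEA-based ranking, and the cost values are hypothetical.

    import numpy as np
    from scipy.optimize import linear_sum_assignment

    # Hypothetical 3x3 fuzzy cost matrix; each entry is a triangular fuzzy number (low, mode, high)
    fuzzy_costs = np.array([
        [(4, 5, 7), (7, 8, 10), (6, 6, 8)],
        [(5, 6, 6), (4, 5, 9),  (8, 9, 11)],
        [(6, 7, 9), (5, 6, 7),  (3, 4, 6)],
    ], dtype=float)

    # Stand-in for the DEA-based ranking: reduce each triangular number to a crisp
    # representative via its centroid (low + mode + high) / 3
    crisp_costs = fuzzy_costs.mean(axis=2)

    # Solve the resulting crisp assignment problem (Hungarian algorithm)
    rows, cols = linear_sum_assignment(crisp_costs)
    print(list(zip(rows, cols)), crisp_costs[rows, cols].sum())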

Relevance: 100.00%

Abstract:

Data envelopment analysis (DEA) has proven to be an excellent data-oriented efficiency analysis method for comparing decision making units (DMUs) with multiple inputs and multiple outputs. In conventional DEA, it is assumed that the status of each measure is clearly known as either input or output. However, in some situations, a performance measure can play an input role for some DMUs and an output role for others. Cook and Zhu [Eur. J. Oper. Res. 180 (2007) 692–699] referred to these variables as flexible measures. The paper proposes an alternative model in which each flexible measure is treated as either an input or an output variable so as to maximize the technical efficiency of the DMU under evaluation. The main focus of this paper is on the impact that flexible measures have on the definition of the production possibility set (PPS) and the assessment of technical efficiency. An example of UK higher education institutions shows the applicability of the proposed approach.
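
For readers unfamiliar with the envelopment form that such models extend, the sketch below computes input-oriented CCR efficiencies with a generic LP solver; the data are hypothetical, and the flexible-measure treatment described in the paper (assigning each flexible measure to the input or output side so as to maximise efficiency) would wrap around this basic model rather than being reproduced here.

    import numpy as np
    from scipy.optimize import linprog

    def ccr_efficiency(X, Y, o):
        """Input-oriented CCR efficiency of DMU o.
        X: (m inputs x n DMUs), Y: (s outputs x n DMUs)."""
        m, n = X.shape
        s, _ = Y.shape
        # Decision variables: [theta, lambda_1 .. lambda_n]; minimise theta
        c = np.r_[1.0, np.zeros(n)]
        # Input constraints:  sum_j lambda_j x_ij - theta x_io <= 0
        A_in = np.c_[-X[:, o], X]
        # Output constraints: -sum_j lambda_j y_rj <= -y_ro
        A_out = np.c_[np.zeros(s), -Y]
        A_ub = np.vstack([A_in, A_out])
        b_ub = np.r_[np.zeros(m), -Y[:, o]]
        bounds = [(None, None)] + [(0, None)] * n
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
        return res.x[0]

    # Hypothetical data: 2 inputs, 1 output, 4 DMUs
    X = np.array([[2.0, 4.0, 3.0, 5.0],
                  [3.0, 2.0, 5.0, 4.0]])
    Y = np.array([[10.0, 12.0, 11.0, 14.0]])
    print([round(ccr_efficiency(X, Y, j), 3) for j in range(4)])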

Relevance: 100.00%

Abstract:

Aim: To examine the academic literature on the grading of corneal transparency and to assess the potential use of objective image analysis. Method: Reference databases of academic literature were searched and relevant manuscripts reviewed. Annunziato, Efron (Millennium Edition) and Vistakon-Synoptik corneal oedema grading scale images were analysed objectively for relative intensity, edges detected, variation in intensity and maximum intensity. In addition, corneal oedema was induced in one subject using a low oxygen transmissibility (Dk/t) hydrogel contact lens worn for 3 hours under a light eye patch. Recovery from oedema was monitored over time using ultrasound pachymetry, high and low contrast visual acuity measures, bulbar hyperaemia grading and transparency image analysis of the test and control eyes. Results: Several methods for assessing corneal transparency are described in the academic literature, but none has gained widespread acceptance in clinical practice. The change in objective image analysis with printed scale grade was best described by quadratic parametric or sigmoid 3-parameter functions. ‘Pupil image scales’ (Annunziato and Vistakon-Synoptik) were best correlated to average intensity; however, the corneal section scale (Efron) was strongly correlated to variations in intensity. As expected, patching an eye wearing a low Dk/t hydrogel contact lens caused a significant (F=119.2, P<0.001) 14.3% increase in corneal thickness, which gradually recovered under open eye conditions. Corneal section image analysis was the most affected parameter, and intensity variation across the slit width, in isolation, was the strongest correlate, accounting for 85.8% of the variance with time following patching and 88.7% of the variance with corneal thickness. Conclusion: Corneal oedema is best determined objectively by the intensity variation across the width of a corneal section. This can be easily measured using a slit-lamp camera connected to a computer. Oedema due to soft contact lens wear is not easily determined over the pupil area by sclerotic scatter illumination techniques.
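
As an illustration of the kind of objective measure described in the conclusion, the sketch below computes the relative variation in intensity across the width of a corneal optic-section image captured with a slit-lamp camera. The frame is synthetic, and the choice of the coefficient of variation as the summary statistic is an assumption, not necessarily the exact metric used in the study.

    import numpy as np

    def slit_intensity_variation(image, slit_rows):
        """Relative variation in intensity across the slit width.

        image: 2-D greyscale array (rows x columns); slit_rows: slice selecting
        the rows that cover the corneal optic section."""
        section = image[slit_rows, :].astype(float)
        profile = section.mean(axis=0)           # mean intensity profile across the slit width
        return profile.std() / profile.mean()    # coefficient of variation of that profile

    # Hypothetical 8-bit slit-lamp frame
    frame = np.random.default_rng(1).integers(0, 256, size=(480, 640))
    print(round(slit_intensity_variation(frame, slice(200, 280)), 4))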

Relevance: 100.00%

Abstract:

In this work we propose a technology for creating intelligent tutoring systems oriented towards teaching knowledge. Expert knowledge is first acquired using the Formal Concept Analysis method; test questions are then constructed and used to check the pupil's knowledge against the expert's knowledge. The subsequent tutoring strategy is generated from the results of this verification.
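
A minimal illustration of the Formal Concept Analysis machinery mentioned above: the two derivation operators applied to a toy object-attribute context yield a formal concept. The context (test questions and the skills they exercise) is entirely hypothetical.

    # Toy formal context: objects (test questions) x attributes (skills they exercise)
    context = {
        "question_1": {"fractions", "addition"},
        "question_2": {"fractions", "multiplication"},
        "question_3": {"addition"},
    }
    all_attributes = set().union(*context.values())

    def intent(objects):
        """Attributes shared by every object in the given set."""
        return set.intersection(*(context[o] for o in objects)) if objects else set(all_attributes)

    def extent(attrs):
        """Objects possessing every attribute in the given set."""
        return {o for o, a in context.items() if attrs <= a}

    # A formal concept is a pair (A, B) with extent(B) == A and intent(A) == B
    B = intent({"question_1", "question_2"})
    A = extent(B)
    print(A, B, extent(B) == A and intent(A) == B)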

Relevance: 100.00%

Abstract:

A test protocol and a data analysis method are developed in this paper on the basis of linear viscoelastic theory to characterize the anisotropic viscoelastic properties of undamaged asphalt mixtures. The test protocol includes three nondestructive tests: (1) the uniaxial compressive creep test, (2) the indirect tensile creep test, and (3) the uniaxial tensile creep test. All three tests are conducted on asphalt mixture specimens at three temperatures (10, 20, and 30°C) to determine the tensile and compressive properties at each temperature and then to construct the master curve of each property. The determined properties include the magnitude and phase angle of the compressive complex modulus in the vertical direction, the magnitude and phase angle of the tensile complex modulus, and the magnitude and phase angle of the compressive complex modulus in the horizontal plane. The test results indicate that all tested asphalt mixtures have significantly different tensile properties from compressive properties. The peak value of the master curve of the tensile complex modulus phase angle is within a range from 65 to 85°, whereas the peak value of the compressive modulus phase angle in both directions ranges from 35 to 55°. In addition, the undamaged asphalt mixtures exhibit distinctly anisotropic properties in compression. The magnitude of the compressive modulus in the vertical direction is approximately 1.2 to 2 times the magnitude of the compressive modulus in the horizontal plane. Dynamic modulus tests are performed to verify the results of the proposed test protocol. The test results from the proposed test protocol match well with those from the dynamic tests. © 2012 American Society of Civil Engineers.
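
One standard way to move from a creep test to the magnitude and phase angle of a complex modulus, consistent with the linear viscoelastic theory the protocol relies on, is to fit a power-law creep compliance and use its exact frequency-domain interconversion. The sketch below shows that route on synthetic data; it illustrates the interconversion only and is not the authors' specific data-analysis method or parameter values.

    import numpy as np
    from scipy.special import gamma as gamma_fn

    # Hypothetical creep-compliance data D(t) (1/MPa) from one uniaxial creep test
    rng = np.random.default_rng(2)
    t = np.linspace(1.0, 60.0, 30)                              # loading time, seconds
    D_meas = 2.0e-4 * t**0.35 * (1 + 0.02 * rng.standard_normal(t.size))

    # Fit a pure power law D(t) = D1 * t**m in log space
    m, logD1 = np.polyfit(np.log(t), np.log(D_meas), 1)
    D1 = np.exp(logD1)

    # Exact interconversion for a power-law compliance:
    #   |D*(w)| = D1 * Gamma(1 + m) * w**(-m),   phase angle = m * pi / 2
    omega = 10.0                                                # rad/s, arbitrary evaluation frequency
    D_star_mag = D1 * gamma_fn(1 + m) * omega**(-m)
    modulus_magnitude = 1.0 / D_star_mag                        # |E*(w)| in MPa, since E*(w) D*(w) = 1
    phase_angle_deg = np.degrees(m * np.pi / 2)
    print(round(modulus_magnitude, 1), round(phase_angle_deg, 1))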

Relevance: 100.00%

Abstract:

Field material testing provides firsthand information on pavement conditions, which is most helpful in evaluating performance and identifying preventive maintenance or overlay strategies. The high variability of field asphalt concrete due to construction raises the demand for accuracy of the test. Accordingly, the objective of this study is to propose a reliable and repeatable methodology to evaluate the fracture properties of field-aged asphalt concrete using the overlay test (OT). The OT is selected because of its efficiency and feasibility for asphalt field cores with diverse dimensions. The fracture properties refer to the Paris' law parameters based on the pseudo J-integral (A and n) because of the sound physical significance of the pseudo J-integral with respect to characterizing the cracking process. In order to determine A and n, a two-step OT protocol is designed to characterize the undamaged and damaged behaviors of asphalt field cores. To ensure the accuracy of the determined undamaged and fracture properties, a new analysis method is then developed for data processing, which combines finite element simulations with mechanical analysis of viscoelastic force equilibrium and the evolution of pseudo displacement work in the OT specimen. Finally, theoretical equations are derived to calculate A and n directly from the OT test data, and the accuracy of the determined fracture properties is verified. The proposed methodology is applied to a total of 27 asphalt field cores obtained from a field project in Texas, including the control Hot Mix Asphalt (HMA) and two types of warm mix asphalt (WMA). The results demonstrate a high linear correlation between n and −log A for all the tested field cores. Investigations of the effect of field aging on the fracture properties confirm that n is a good indicator to quantify the cracking resistance of asphalt concrete. It is also indicated that summer climatic conditions clearly accelerate the rate of aging. The impact of the WMA technologies on the fracture properties of asphalt concrete is visualized by comparing the n-values. It shows that the Evotherm WMA technology slightly improves the cracking resistance, while the foaming WMA technology provides fracture properties comparable to those of the HMA. After 15 months of aging in the field, the cracking resistance does not differ significantly between the HMA and WMAs, which is confirmed by observations of field distresses.
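
Once crack growth rates and pseudo J-integral values are available from the OT data, A and n follow from a straight-line fit in log-log space, which is also why a linear correlation between n and −log A can be reported. A minimal sketch with hypothetical numbers:

    import numpy as np

    # Hypothetical per-cycle crack growth rates and pseudo J-integral values from an OT test
    delta_JR = np.array([0.5, 1.0, 2.0, 4.0, 8.0])        # pseudo J-integral (arbitrary units)
    da_dN = np.array([2e-5, 9e-5, 4e-4, 1.6e-3, 7e-3])    # crack growth per load cycle

    # Paris' law da/dN = A * (J_R)^n  ->  log(da/dN) = log A + n * log(J_R)
    n, logA = np.polyfit(np.log10(delta_JR), np.log10(da_dN), 1)
    A = 10**logA
    print(f"A = {A:.3e}, n = {n:.2f}, -log A = {-logA:.2f}")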

Relevance: 100.00%

Abstract:

A methodology for formally modeling and analyzing the software architecture of mobile agent systems provides a solid basis for developing high quality mobile agent systems, and the methodology is helpful for studying other distributed and concurrent systems as well. However, providing such a methodology is a challenge because of the agent mobility in mobile agent systems. The methodology was defined from two essential parts of software architecture: a formalism to define the architectural models and an analysis method to formally verify system properties. The formalism is two-layer Predicate/Transition (PrT) nets extended with dynamic channels, and the analysis method is a hierarchical approach to verify models on different levels. The two-layer modeling formalism smoothly transforms physical models of mobile agent systems into their architectural models. Dynamic channels facilitate the synchronous communication between nets, and they naturally capture the dynamic architecture configuration and agent mobility of mobile agent systems. Component properties are verified based on transformed individual components, system properties are checked in a simplified system model, and interaction properties are analyzed on models composed from the involved nets. Based on the formalism and the analysis method, this researcher formally modeled and analyzed a software architecture of mobile agent systems, and designed an architectural model of a medical information processing system based on mobile agents. The model checking tool SPIN was used to verify system properties such as reachability, concurrency and safety of the medical information processing system. From the successful modeling and analysis of the software architecture of mobile agent systems, the conclusion is that PrT nets extended with channels are a powerful tool to model mobile agent systems, and the hierarchical analysis method provides a rigorous foundation for the modeling tool. The hierarchical analysis method not only reduces the complexity of the analysis, but also expands the application scope of model checking techniques. The results of formally modeling and analyzing the software architecture of the medical information processing system show that model checking is an effective and efficient way to verify software architecture. Moreover, this system shows a high level of flexibility, efficiency and low cost of mobile agent technologies.
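
The kind of reachability checking that underpins such Petri-net-based verification can be illustrated with a breadth-first exploration of the marking graph of a toy place/transition net. The net below (an agent migrating between hosts) is hypothetical and omits the predicate inscriptions and dynamic channels of the full PrT formalism.

    from collections import deque

    # Toy place/transition net; pre- and post-sets map places to token counts
    transitions = {
        "start_migration": ({"agent_at_A": 1, "channel_free": 1}, {"agent_in_transit": 1}),
        "arrive":          ({"agent_in_transit": 1}, {"agent_at_B": 1, "channel_free": 1}),
        "process_task":    ({"agent_at_B": 1, "task_pending": 1}, {"agent_at_B": 1, "task_done": 1}),
    }
    initial = {"agent_at_A": 1, "channel_free": 1, "task_pending": 1}

    def fire(marking, pre, post):
        """Return the successor marking, or None if the transition is not enabled."""
        if any(marking.get(p, 0) < k for p, k in pre.items()):
            return None
        new = dict(marking)
        for p, k in pre.items():
            new[p] -= k
        for p, k in post.items():
            new[p] = new.get(p, 0) + k
        return {p: k for p, k in new.items() if k}

    # Breadth-first exploration of the reachability graph
    seen, queue = set(), deque([initial])
    while queue:
        marking = queue.popleft()
        key = frozenset(marking.items())
        if key in seen:
            continue
        seen.add(key)
        for pre, post in transitions.values():
            successor = fire(marking, pre, post)
            if successor is not None:
                queue.append(successor)

    print(len(seen), "reachable markings")
    print(any(dict(k).get("task_done") for k in seen))   # is a completed task reachable?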

Relevance: 100.00%

Abstract:

The standard highway assignment model in the Florida Standard Urban Transportation Modeling Structure (FSUTMS) is based on the equilibrium traffic assignment method. This method involves running several iterations of all-or-nothing capacity-restraint assignment with an adjustment of travel time to reflect delays encountered in the associated iteration. The iterative link time adjustment process is accomplished through the Bureau of Public Roads (BPR) volume-delay equation. Since FSUTMS' traffic assignment procedure outputs daily volumes, and the input capacities are given in hourly volumes, it is necessary to convert the hourly capacities to their daily equivalents when computing the volume-to-capacity ratios used in the BPR function. The conversion is accomplished by dividing the hourly capacity by a factor called the peak-to-daily ratio, referred to as CONFAC in FSUTMS. The ratio is computed as the highest hourly volume of a day divided by the corresponding total daily volume. While several studies have indicated that CONFAC is a decreasing function of the level of congestion, a constant value is used for each facility type in the current version of FSUTMS. This ignores the different congestion level associated with each roadway and is believed to be one of the culprits of traffic assignment errors. Traffic count data from across the state of Florida were used to calibrate CONFACs as a function of a congestion measure using the weighted least squares method. The calibrated functions were then implemented in FSUTMS through a procedure that takes advantage of the iterative nature of FSUTMS' equilibrium assignment method. The assignment results based on constant and variable CONFACs were then compared against the ground counts for three selected networks. It was found that the accuracy of the two assignments was not significantly different, and that the hypothesized improvement in assignment results from the variable CONFAC model was not empirically evident. It was recognized that many other factors beyond the scope and control of this study could contribute to this finding. It was recommended that further studies focus on the use of the variable CONFAC model with recalibrated parameters for the BPR function and/or with other forms of volume-delay functions.
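
The interaction between CONFAC and the BPR function can be made concrete: the hourly capacity is divided by CONFAC to obtain a daily equivalent before the volume-to-capacity ratio enters the delay equation. The sketch below contrasts a constant CONFAC with an illustrative congestion-dependent one; the numbers, the BPR coefficients and the functional form of the variable CONFAC are hypothetical, not the calibrated Florida values.

    def bpr_travel_time(free_flow_time, daily_volume, hourly_capacity, confac, alpha=0.15, beta=4.0):
        """BPR volume-delay with an hourly capacity converted to a daily equivalent.

        daily_capacity = hourly_capacity / CONFAC, where CONFAC is the
        peak-hour-to-daily volume ratio."""
        daily_capacity = hourly_capacity / confac
        vc_ratio = daily_volume / daily_capacity
        return free_flow_time * (1.0 + alpha * vc_ratio**beta)

    # Constant CONFAC versus a hypothetical congestion-dependent CONFAC
    t_const = bpr_travel_time(10.0, 45000, 3600, confac=0.095)

    vc = 45000 * 0.095 / 3600                    # congestion measure from a first pass
    confac_var = max(0.06, 0.10 - 0.02 * vc)     # illustrative decreasing function, not calibrated values
    t_var = bpr_travel_time(10.0, 45000, 3600, confac=confac_var)
    print(round(t_const, 2), round(t_var, 2))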

Relevance: 100.00%

Abstract:

Despite widespread recognition of the problem of adolescent alcohol and other drug (AOD) abuse, research on its most common treatment modality, group work, is lacking. This research gap is alarming given that outcomes range from positive to potentially iatrogenic. This study sought to identify change mechanisms and/or treatment factors that are observable within group treatment sessions and that may predict AOD use outcomes. This NIH (F31 DA 020233-01A1) study evaluated 108 10- to 19-year-olds and the 19 school-based treatment groups to which they were previously assigned (R01 AA10246; PI: Wagner). Associations between motivational interviewing (MI) based change talk variables, group leader MI skills, and alcohol and marijuana use outcomes up to 12 months following treatment were evaluated. Treatment session audio recordings and transcripts (1R21AA015679-01; PI: Macgowan) were coded using a new discourse analysis coding scheme for measuring group member change talk (Amrhein, 2003). Therapist MI skills were similarly measured using the Motivational Interviewing Treatment Integrity instrument. Group member responses to commitment predicted group marijuana use at the 1-month follow-up. Also, group leader empathy was significantly associated with group commitment for marijuana use at the middle and ending stages of treatment. Both of the above process measures were applied in a group setting for the first time. Building upon MI and social learning theory principles, group commitment and group member responses to commitment are new observable, in-session process constructs that may predict positive and negative adolescent group treatment outcomes. These constructs, as well as the discourse analysis method and instruments used to measure them, raise many possibilities for future group work process research and practice.

Relevance: 100.00%

Abstract:

Concurrent software executes multiple threads or processes to achieve high performance. However, concurrency results in a huge number of different system behaviors that are difficult to test and verify. The aim of this dissertation is to develop new methods and tools for modeling and analyzing concurrent software systems at the design and code levels. This dissertation consists of several related results. First, a formal model of Mondex, an electronic purse system, is built using Petri nets from user requirements and is formally verified using model checking. Second, Petri net models are automatically mined from the event traces generated from scientific workflows. Third, partial order models are automatically extracted from instrumented concurrent program executions, and potential atomicity violation bugs are automatically verified based on the partial order models using model checking. Our formal specification and verification of Mondex have contributed to the worldwide effort in developing a verified software repository. Our method to mine Petri net models automatically from provenance offers a new approach to building scientific workflows. Our dynamic prediction tool, named McPatom, can predict several known bugs in real-world systems, including one that evades several other existing tools. McPatom is efficient and scalable as it takes advantage of the nature of atomicity violations and considers only a pair of threads and accesses to a single shared variable at one time. However, predictive tools need to consider the tradeoffs between precision and coverage. Based on McPatom, this dissertation presents two methods for improving the coverage and precision of atomicity violation predictions: 1) a post-prediction analysis method to increase coverage while ensuring precision; 2) a follow-up replaying method to further increase coverage. Both methods are implemented in a completely automatic tool.
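
The restriction to a pair of threads and a single shared variable keeps the search space small because, for one variable, only a handful of three-access interleavings are unserializable. The sketch below checks an access trace against those patterns; it is a simplified illustration of the idea, not the McPatom model-checking approach itself, and the trace is hypothetical.

    # Unserializable three-access interleavings for one shared variable and two threads
    UNSERIALIZABLE = {("R", "W", "R"), ("W", "W", "R"), ("R", "W", "W"), ("W", "R", "W")}

    def atomicity_violations(trace):
        """trace: list of (thread_id, access) events on a single shared variable,
        where access is 'R' or 'W'. Reports local access pairs from one thread that
        are interleaved by a remote access forming an unserializable pattern."""
        violations = []
        for i, (t1, a1) in enumerate(trace):
            for j in range(i + 1, len(trace)):
                t2, a2 = trace[j]
                if t2 == t1:                    # closing the local pair (a1, a2)
                    for k in range(i + 1, j):
                        tr, ar = trace[k]
                        if tr != t1 and (a1, ar, a2) in UNSERIALIZABLE:
                            violations.append((i, k, j, (a1, ar, a2)))
                    break                       # only pair with the immediately following local access
        return violations

    # Hypothetical interleaved trace of two threads on one shared variable
    trace = [("T1", "R"), ("T2", "W"), ("T1", "W"), ("T2", "R")]
    print(atomicity_violations(trace))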