866 results for Limitation of Actions
Abstract:
Rock mass is widely recognized as a geologic body consisting of rock blocks and discontinuities. Its deformation and failure are determined not only by the rock blocks but also, and often more importantly, by the discontinuities. The mutual cutting and combination of discontinuities control the mechanical properties of the rock mass, and their complex geometry produces strong anisotropy, especially under the influence of in-situ (ground) stress. Engineering practice has shown that brittle failure of hard rock often occurs at working stresses far below the yield and compressive strengths, and that such failure is directly related to fracture propagation along discontinuities; fracture propagation is therefore the essence of hard rock failure. The discontinuous mechanical properties of rock mass can be studied rigorously by combining statistical analysis of discontinuities with fracture mechanics. According to the superposition principle of fracture mechanics, either Problem A or Problem C can be chosen for the analysis. Problem A computes the crack-tip stress and displacement fields of internal discontinuities numerically. Problem C computes the crack-tip stress and displacement fields under the assumption that the overall stress field of the rock mass is already known; it therefore avoids the complex mutual interference of the stress fields of discontinuities, known in fracture mechanics as the crack-system problem. Solving Problem C requires field measurement of the in-situ stress in the rock mass, after which the linear superposition of the strain energies of discontinuities is scientifically sound. The main difference between fracture mechanics of rock mass and of other materials can be stated as follows: fracture mechanics of other materials mostly faces Problem A and cannot avoid the multi-crack difficulty, whereas rock mass fracture mechanics addresses Problem C, which avoids the mutual interference of multiple discontinuities by means of ground stress measurement. On the basis of Problem C, fracture mechanics can be applied to rock mass conveniently. The statistical fracture constitutive relations of rock mass introduced in this thesis are based on Problem C and on the linear superposition of discontinuity strain energies. These constitutive relations have several merits: first, they are physical rather than empirical; second, they are well suited to describing the anisotropy of rock mass; third, they account for external factors such as ground stress. The statistical fracture constitutive relation is therefore an effective approach to physically based, anisotropic, ground-stress-dependent rock mass problems. Building on previous statistical fracture constitutive relations, this thesis improves the distribution function of the discontinuities: it derives the limitation of the one-parameter negative exponential distribution in regression analysis and advocates the two-parameter negative exponential distribution instead. To address two-dimensional stability problems on key engineering cross-sections of rock mass, the thesis derives the planar flexibility (compliance) tensor of rock mass and establishes a two-dimensional statistical fracture constitutive relation for through-going (penetrating) fractures on the basis of through-crack fracture mechanics.
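One detail mentioned above, the replacement of the one-parameter negative exponential distribution by a two-parameter form, can be sketched in generic notation (the symbols below are illustrative, not the thesis's own):

    one-parameter:   f(x) = \lambda e^{-\lambda x},            x \ge 0
    two-parameter:   f(x) = \lambda e^{-\lambda (x - x_0)},    x \ge x_0

The added location parameter x_0 shifts the support away from the origin, so the fitted mean becomes x_0 + 1/\lambda rather than 1/\lambda. One plausible reading of the limitation argued above is that forcing the fit through the origin biases the regression of measured discontinuity spacings or trace lengths, a constraint the extra parameter relaxes.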
Based on research on crack-tip plasticity of through-going fractures, for example Irwin's plastic-zone equivalent crack, the thesis establishes a way to deal with the stress singularity and plastic yielding at discontinuity tips. Deformation parameters have always been a focus of rock mass mechanics. After excavation of the dam foundation of the Xiaowan hydroelectric power station, a great number of unloading cracks developed in the foundation rock mass, and its mechanical behaviour became intricate and strongly anisotropic. The foundation rock mass mainly develops three sets of discontinuities: gently dipping discontinuities, steeply dipping discontinuities, and schistosity planes, most of which have undergone partial unloading and loosening. According to in-situ stress measurements, the stress field of the dam foundation is highly non-uniform, being strongly affected by the tectonic stress field, the self-weight stress field, the geometric boundary conditions of the excavation, and excavation unloading. The complexity of the discontinuities and the heterogeneity of the stress field make the mechanical properties of the foundation rock mass intricate and variable, and if rock mass mechanics research does not take every relevant influencing factor into account as far as possible, major errors are likely. This thesis calculates the elastic modulus of the rock mass after excavation of the Xiaowan dam foundation was completed. The calculation region covers every monolith of the Xiaowan concrete double-curvature arch dam, and for different monoliths either the through-crack (penetrating) or the buried-crack statistical fracture constitutive relation is adopted as appropriate. The statistical fracture constitutive relation suits the strongly anisotropic and heterogeneous rock mass of the Xiaowan dam foundation. The elastic moduli obtained from the statistical fracture constitutive relation are compared with those measured by inclined-plane loading tests and those estimated by the RMR method; the three sets of moduli agree well, so the statistical fracture constitutive relations can be trusted. In summary, this thesis completes the following work on the basis of previous research: demonstration of Problem C of the superposition principle in fracture mechanics; establishment of a two-dimensional statistical fracture constitutive relation of rock mass for through-going fractures; demonstration of the limitation of the negative exponential distribution and its improvement; improvement of the three-dimensional statistical fracture constitutive relation for buried fractures; equivalent calculation of the plastic zone at discontinuity tips; and calculation of the rock mass elastic modulus on two-dimensional cross-sections. The overall research approach follows the "statistical rock mass mechanics" of Wu Faquan (1992).
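The Irwin-type treatment of the crack-tip plastic zone mentioned above rests on a standard textbook relation, sketched here in generic notation for a mode I crack under plane stress (K_I is the stress intensity factor, \sigma_{ys} the yield strength; the plane-strain coefficient differs):

    r_y = \frac{1}{2\pi}\left(\frac{K_I}{\sigma_{ys}}\right)^2, \qquad a_{eff} = a + r_y

Replacing the physical crack length a by the effective length a_eff removes the elastic singularity at the tip in an approximate way; this is the kind of equivalent-crack correction the abstract describes applying to discontinuity tips.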
Abstract:
How to reconstruct the Earth's geometric structure and the distribution of its physical properties exactly from the kinematic and dynamic characteristics of seismic waves recorded in seismological observation has long been a difficult problem for seismologists. Joint imaging of seismic reflectors and of anisotropy in the Earth's interior is becoming a research hot spot. The limitations of the shot and observation system mean that the available seismic data are too scarce to reconstruct geological objects exactly. It is common to invert for the Earth's velocity distribution using only reflection traveltimes or only polarization information while keeping the reflector geometry fixed (or vice versa); because effective data are lacking, this leads to severely non-unique reconstructions, and the non-uniqueness of reconstructed anisotropy is even more serious than in isotropic media. Constraining the medium with traveltime or polarization information alone is clearly insufficient and can sometimes produce distorted images and misinterpretation of subsurface structure. We therefore try to reconstruct both the reflection structure (geometry) and the anisotropic structure (physics) of the Earth's interior by jointly using the kinematic and dynamic characteristics of seismic waves. The new experiment is carried out step by step, and the research comprises two parts: one is the reconstruction of the P-wave vertical velocity and the anisotropic structure (Thomsen parameters ε and δ) in transversely isotropic media with a vertical symmetry axis (VTI) with the geometric structure fixed; the other is the simultaneous inversion of the reflector geometry and the anisotropic structure from joint seismic reflection traveltime and polarization data. Simulated annealing is used in the first part; linear inversion based on BG theory and simulated annealing are applied in the second. All methods are checked with model experiments and then applied to real data from the wide-angle seismic profile from Tunxi, Anhui Province, to Wenzhou, Zhejiang Province. The results are as follows. Inversions that jointly use PP-wave or PSV-wave reflection traveltimes and polarizations are closer to the true model than those based on either data type alone, showing that the method presented here can effectively reconstruct anisotropy in the Earth's interior when the reflector structure is fixed. Layer thickness, P-wave vertical velocity and the Thomsen anisotropy parameters (ε and δ) can be resolved simultaneously by joint inversion of reflection traveltimes and polarizations with the linear inversion method based on BG theory. Images of the reflector structure, the P-wave vertical velocity and the anisotropy parameters in the crust are obtained from the wide-angle profile from Tunxi (Anhui Province) to Wenzhou (Zhejiang Province). The results reveal differences in reflector geometry and crustal physical properties between the Yangtze block and the Cathaysia block, and are used to attempt an interpretation of the characteristics of the crustal stress field in the area.
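A minimal sketch of the simulated-annealing step described above, assuming a toy layered VTI forward model. The function predict_traveltimes, the perturbation size and the cooling schedule are illustrative stand-ins, not the authors' implementation:

    # Sketch of a simulated-annealing inversion for (vp0, epsilon, delta, thickness).
    # The forward model is a crude short-spread VTI moveout approximation used only
    # so the sketch runs end to end; a real implementation would trace reflected rays.
    import math
    import random

    random.seed(0)

    def predict_traveltimes(model, offsets):
        vp0, eps, delta, thickness = model
        t0 = 2.0 * thickness / vp0
        vnmo = vp0 * math.sqrt(1.0 + 2.0 * delta)
        eta = (eps - delta) / (1.0 + 2.0 * delta)
        times = []
        for x in offsets:
            t2 = t0 ** 2 + (x / vnmo) ** 2 - 2.0 * eta * x ** 4 / (t0 ** 2 * vnmo ** 4)
            times.append(math.sqrt(max(t2, 0.0)))
        return times

    def misfit(model, offsets, observed):
        predicted = predict_traveltimes(model, offsets)
        return sum((p - o) ** 2 for p, o in zip(predicted, observed))

    def simulated_annealing(observed, offsets, start, steps=20000, t_start=1.0, cooling=0.9995):
        current, best = list(start), list(start)
        e_current = e_best = misfit(current, offsets, observed)
        temperature = t_start
        for _ in range(steps):
            # Random multiplicative perturbation of all parameters.
            candidate = [p * (1.0 + random.uniform(-0.01, 0.01)) for p in current]
            e_candidate = misfit(candidate, offsets, observed)
            delta_e = e_candidate - e_current
            # Accept downhill moves always, uphill moves with Boltzmann probability.
            if delta_e < 0.0 or random.random() < math.exp(-delta_e / temperature):
                current, e_current = candidate, e_candidate
                if e_current < e_best:
                    best, e_best = list(current), e_current
            temperature *= cooling
        return best, e_best

    offsets = [200.0 * i for i in range(1, 21)]                  # 200 m to 4 km
    true_model = (3000.0, 0.10, 0.05, 2000.0)                    # vp0 (m/s), epsilon, delta, thickness (m)
    observed = predict_traveltimes(true_model, offsets)
    recovered, error = simulated_annealing(observed, offsets, start=(2800.0, 0.05, 0.02, 1800.0))
    print("recovered model:", [round(v, 4) for v in recovered], "misfit:", error)

As the abstract notes, traveltimes alone constrain such a model only weakly, so the recovered parameters trade off against one another; joint use of polarization data is the remedy the research proposes.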
Abstract:
Customer decision making has long been a central concern in consumer research. Although China has entered an era of brand consumption and development, differences between companies and customers in which attributes they regard as important produce the phenomenon that "award-winning products do not sell well, while products that sell well win no awards." At the same time, little research on the relationship between brands and customers has so far been conducted in China. Traditional research on consumer psychology relies mainly on questionnaires, in-depth interviews and group discussions. In cognitive psychology, however, the limitations of explicit memory have been revealed by research on implicit memory, and unconscious cognition and implicit memory also influence customers' evaluation of a brand; the traditional methods are therefore not accurate enough. Reaction time is an effective way to reveal test quality and can also reveal implicit cognition. Building on this, the present research investigates the validity of identifying regarded attributes with a reaction-time method, using questionnaires and reaction-time tests with 360 customers in three cities, which may overcome the limitations of the traditional research methods. The 352 valid samples were analyzed with SPSS. The results showed no distinct correspondence between product attributes and reaction time. The key attributes selected by questionnaire importance ratings and by the shortest-reaction-time criterion were then used in regressions on customers' overall ratings (such as overall satisfaction, objective quality and recommendation intention). The regression coefficients of the attributes selected by reaction time on the overall ratings were significant, while those of the attributes selected by importance rating were not. The main conclusions are: 1. Regarded attributes can be obtained from the reaction times of brand performance ratings. 2. Regarded attributes obtained from reaction times are more accurate than those obtained from importance-rating questionnaires. 3. A brand's core attributes should include the attributes regarded during the decision-making process.
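A minimal sketch of the kind of regression reported above, with an overall rating regressed on attribute performance scores; the data, attribute count and coefficients below are invented placeholders, not the survey data:

    # Regress an overall rating on attribute scores (e.g. attributes selected
    # either by shortest reaction time or by importance rating). Synthetic data.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 352                                          # number of valid respondents above
    attribute_scores = rng.normal(size=(n, 3))       # placeholder attribute ratings
    overall_rating = (0.6 * attribute_scores[:, 0]
                      + 0.3 * attribute_scores[:, 1]
                      + rng.normal(scale=0.5, size=n))

    X = np.column_stack([np.ones(n), attribute_scores])      # intercept + attributes
    coef, _, _, _ = np.linalg.lstsq(X, overall_rating, rcond=None)
    print("intercept and attribute coefficients:", np.round(coef, 3))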
Abstract:
The relationship between working memory (WM) and attention has attracted many researchers. In his embedded-processes model of WM, Cowan (1999, 2001) uses the term "focus of attention" to refer to the core component of WM and proposes that the focus of attention of WM and that of perception have the same span, which is a fixed number. This hypothesis about the scope of attention has seldom been tested, although considerable research has revealed that WM and attention share overlapping mechanisms. The present dissertation tests the hypothesis by examining dual-task interference between the Corsi Blocks Task (CBT) and Multiple Object Tracking (MOT), and the findings demonstrate that Cowan's hypothesis is not exactly true. The first study shows that the interference effect of MOT on CBT is a reliable indicator of whether, and to what extent, the attentional resources of WM and perception overlap. The second study finds that the capacity of the common resources is not a fixed number but varies with the difficulty of attentional control. The third study indicates that the attentional resources used in WM and perception are partly independent, and that the overlapping part can attend to only one or two items or locations at a time. These findings can contribute to future studies of the capacity limitations of different cognitive functions and to the development of relevant ability tests.
Abstract:
This dissertation systematically described and improved the application of Independent Component Analysis (ICA) to functional Magnetic Resonance Imaging (fMRI), following a logic of verification, improvement, extension and application, with the concept of "reproducibility" as the philosophy running through its four studies. In the "verification" study, ICA was applied to resting-state fMRI data, the resulting components were validated by their reproducibility, and the consistency of the results from ICA and from the traditional "seed voxel" method was examined; the limitations of applying ICA to fMRI data analysis were also identified. In the "improvement" study, an improved reproducibility-based ICA algorithm, RAICAR, was developed to address some of these limitations. RAICAR ranks ICA components by reproducibility, determines the number of reliable components, and yields more stable results, providing useful tools for validating and interpreting ICA results. In the "extension" study, RAICAR and the concept of reproducibility were extended to multi-subject ICA analysis, yielding the gRAICAR algorithm. gRAICAR allows some variation across subjects while examining components common to them, and can detect potential subject groupings on particular components, offering a new way to perform exploratory group analysis of fMRI data. In the "application" study, the two newly developed methods, RAICAR and gRAICAR, were used to investigate the effect of early musical training on the brain mechanisms of memory and learning. The results showed differences in the brain mechanisms of memory retrieval and learning between the two groups of subjects, and also demonstrated the usefulness and importance of the new methods.
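A minimal sketch of the reproducibility idea behind RAICAR as described above, not the published algorithm: run spatial ICA several times with different random seeds, match components across runs by absolute spatial correlation, and rank them by how consistently they reappear. The data shapes, seeds and the simplified matching rule are illustrative assumptions:

    # Rank ICA components by reproducibility across repeated runs (simplified
    # illustration of the RAICAR idea). Uses scikit-learn's FastICA as a generic
    # ICA implementation and random data as a stand-in for an fMRI matrix.
    import numpy as np
    from sklearn.decomposition import FastICA

    def ica_spatial_maps(data, n_components, seed):
        """data: (n_voxels, n_timepoints); returns (n_components, n_voxels) maps."""
        ica = FastICA(n_components=n_components, random_state=seed, max_iter=1000)
        sources = ica.fit_transform(data)          # (n_voxels, n_components)
        return sources.T                           # rows are spatial maps

    def reproducibility_scores(data, n_components=20, seeds=(0, 1, 2, 3, 4)):
        runs = [ica_spatial_maps(data, n_components, s) for s in seeds]
        reference = runs[0]
        scores = np.zeros(n_components)
        for comp in range(n_components):
            matches = []
            for other in runs[1:]:
                # Best sign-invariant spatial match in the other run.
                corr = np.abs(np.corrcoef(reference[comp], other)[0, 1:])
                matches.append(corr.max())
            scores[comp] = np.mean(matches)        # average best-match correlation
        order = np.argsort(scores)[::-1]           # most reproducible first
        return scores, order

    rng = np.random.default_rng(42)
    fake_fmri = rng.standard_normal((5000, 120))   # (voxels, timepoints) placeholder
    scores, order = reproducibility_scores(fake_fmri, n_components=10)
    print(np.round(scores[order], 3))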
Abstract:
This thesis investigates the risk preferences of Chinese company managers in various simulated decision situations and their perceptions of the risks attached to different types of business decisions. Four studies were conducted. Study I is a utility analysis: 214 company managers and 46 middle-school headmasters responded to a Utility Measurement Survey. The results indicate: (1) The managers' risk preferences vary across decision situations; in most situations most managers are risk averse, while in a few they are risk seeking. (2) In some decision situations there are significant differences in risk preference between business managers and school headmasters, male and female managers, senior and junior managers, managers with high and low qualifications, managers of non-state-owned and state-owned firms, and managers of medium-small and large firms; in the other situations no significant differences are found. (3) In all decision situations, no significant differences in risk preference are found among managers of different marital status, experience, age or education. Study II is a risky-decision simulation: the Risky Decision Situations Simulation Survey was administered to 82 company managers. The result indicates that firm culture, business conditions, the survival limit and the risk preference of superiors influence the managers' risky decision-making behaviour. Study III concerns perceptions of business decision risks: 68 company managers completed the Decision Cases Risk Perception Inventory. The results indicate: (1) Inaccurate market analysis and prediction, political instability and changes in economic policy are the riskier elements of strategic decisions. (2) Erroneous market analysis and prediction, the appearance of new technology and changes in market demand are the riskier elements of investment decisions. (3) Poor quality control, backward technology and excessive stocks are the riskier elements of production decisions. (4) Shortage of development funds, wrong choice of development project and limitations of development ability are the riskier elements of new product development decisions. (5) Non-payment of the foreign partner's capital, changes in relevant national policy, marketing difficulties and excessive prices of the foreign partner's equipment are the riskier elements of joint-venture decisions. (6) Unfamiliarity with the people concerned and misjudgement of their qualifications are the riskier elements of personnel decisions. (7) A poor market for the product, defects in product quality and changes in consumer demand are the riskier elements of marketing decisions. (8) Wrong strategy and ambiguous goals are the riskier elements of public-relations decisions. (9) Violation of the law, ambiguous goals and poor creativity are the riskier elements of advertising decisions. (10) Deterioration of diplomatic relations, products unsuitable for foreign consumers and unfamiliarity with foreign markets are the riskier elements of international business decisions. Study IV is a structured interview: five company managers answered all questions of the Interview Questionnaire.
The results indicate: (1) The managers regard risks as the possible unfavourable consequences of decisions; (2) The managers' self-ratings are consistent with the results of the utility measurement; (3) The managers acknowledge that risks always accompany business decisions; (4) Individual differences in risk perception are found among managers. The thesis also points out the important implications of the research and discusses several questions for further study.
Abstract:
Schema acquisition is one of the mechanisms of learning, and how to design teaching materials that promote it is an important question for both psychological researchers and educators. Cognitive Load Theory holds that human cognitive resources are limited, so the organization and presentation of learning material should avoid making learners spend resources on actions that have nothing to do with schema acquisition. How can this be done? Sweller and colleagues argue that increasing the operation cost of the learning material makes students devote more resources to executing operations, a form of resource consumption unrelated to schema acquisition; therefore, to let students devote more resources to actions related to schema acquisition, the operation cost of the learning material should be decreased. However, the results of O'Hara and colleagues indicate that, in problem solving in knowledge-lean domains, increasing the operation cost leads college students to invest more resources in planning and understanding, so that increasing the operation cost facilitates schema acquisition. How does operation cost affect middle-school students' (MSS) schema acquisition and resource allocation when they solve problems in knowledge-lean or knowledge-rich domains? This is the main question of the present research. Three experiments indicate that when the operation cost of actions is increased, implementing actions decrease and planning actions increase, so increasing the operation cost can promote schema acquisition. We explain this result with a "cost-benefit analysis" strategy: people are rational, and before performing an action they weigh its cost against the expected benefit; if the expected benefit exceeds the cost the action is carried out, and if the cost exceeds the expected benefit the action is withheld. On the one hand, this research further confirms the core claim of Cognitive Load Theory, that human cognitive resources are limited and should be devoted to actions related to schema acquisition; on the other hand, it raises questions about the material-design principle advanced by Cognitive Load Theory. Moreover, the question raised here shares some views with constructivist accounts of learning: learning is not passive absorption of information but the active construction of its meaning, and this construction cannot be done by others. The results can provide theoretical guidance and an experimental basis for designing middle-school science teaching materials from a new angle.
Abstract:
Businesses interact constantly with their environment, carrying out numerous and heterogeneous exchanges. Organizations can be considered systems of different, frequently conflicting interests, and satisfying different stakeholders is a condition of success and survival. National and international literature attempts to explain the complex connection between companies and their environment. In particular, Stakeholder Theory considers it crucial for businesses to identify their different stakeholders and involve them in decision-making. In this context, profit cannot be regarded as the only purpose of a company's existence, and business aims become more numerous and varied. Stakeholder Theory is often used as a framework in tourism studies, in particular in research on Sustainable Tourism Development. Authors consider tourism development sustainable when it satisfies the interests of different stakeholders, traditionally identified as the local community and government, businesses, tourists and the natural environment. Tourism businesses have to guarantee the optimal use of natural resources, respect for the socio-cultural traditions of the local community, and the creation of socio-economic benefits for all stakeholders in destinations. An obstacle to sustainable tourism development in many destinations worldwide is the seasonality of tourism demand, whose negative impact on the environment, the economy and communities can be highly significant: pollution, difficulties in the use of public services, stress for residents and seasonal incomes are all examples. According to the World Tourism Organization (2004), limiting seasonality can favour the sustainability of tourism. The literature suggests private and public strategies to minimize its negative effects, such as diversifying tourism products, identifying new market segments, launching events, applying public instruments such as eco-taxes, and using differential pricing policies. Revenue Management is a managerial system based on differential pricing that is able to influence price-sensitive tourists. This research attempts to verify whether Revenue Management, created to maximize profits in tourism companies, can also mitigate the seasonality of tourism demand, producing benefits for the different stakeholders of destinations and contributing to Sustainable Tourism Development. In particular, the study addresses the following research questions: 1) Can Revenue Management control the flow of tourist demand? 2) Can Revenue Management limit seasonality, producing benefits for the different stakeholders of a destination? 3) Can Revenue Management favour the development of Sustainable Tourism? The literature review on Stakeholder Theory, Sustainable Tourism Development, tourism seasonality and Revenue Management forms the foundation of the research, which is based on a case study of a significant destination on the southern coast of Sardinia, Italy. A deductive methodology was applied, using both qualitative and quantitative methods. The study shows that Revenue Management has the potential to limit tourism seasonality, to mitigate the negative impacts of tourism activities, producing benefits for the local community, and to contribute to Sustainable Tourism Development.
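A toy sketch of the differential-pricing logic on which Revenue Management rests, as described above; the occupancy thresholds and price multipliers are invented for illustration only:

    # Nightly rate adjusted by forecast occupancy, so that price-sensitive demand
    # is nudged toward low-season dates. Purely illustrative numbers.
    def nightly_rate(base_rate, forecast_occupancy):
        """Return a rate for a date given its forecast occupancy (0.0-1.0)."""
        if forecast_occupancy >= 0.85:      # peak demand: raise the price
            return base_rate * 1.40
        if forecast_occupancy <= 0.40:      # low season: discount to attract demand
            return base_rate * 0.70
        return base_rate                    # shoulder season: keep the base rate

    for occupancy in (0.30, 0.60, 0.95):
        print(occupancy, nightly_rate(100.0, occupancy))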
Abstract:
R. Jensen and Q. Shen. Fuzzy-Rough Sets Assisted Attribute Selection. IEEE Transactions on Fuzzy Systems, vol. 15, no. 1, pp. 73-89, 2007.
Abstract:
R. Jensen and Q. Shen, 'Tolerance-based and Fuzzy-Rough Feature Selection,' Proceedings of the 16th International Conference on Fuzzy Systems (FUZZ-IEEE'07), pp. 877-882, 2007.
Abstract:
J. Keppens, Q. Shen and M. Lee. Compositional Bayesian modelling and its application to decision support in crime investigation. Proceedings of the 19th International Workshop on Qualitative Reasoning, pages 138-148.
Abstract:
Postgraduate project/dissertation presented to Universidade Fernando Pessoa as part of the requirements for the degree of Master in Dental Medicine
Abstract:
Postgraduate project/dissertation presented to Universidade Fernando Pessoa as part of the requirements for the degree of Master in Pharmaceutical Sciences
Abstract:
Temporal structure in skilled, fluent action exists at several nested levels. At the largest scale considered here, short sequences of actions that are planned collectively in prefrontal cortex appear to be queued for performance by a cyclic competitive process that operates in concert with a parallel analog representation implicitly specifying the relative priority of elements of the sequence. At an intermediate scale, single acts, like reaching to grasp, depend on coordinated scaling of the rates at which many muscles shorten or lengthen in parallel. To ensure the success of acts such as catching an approaching ball, such parallel rate scaling, which appears to be one function of the basal ganglia, must be coupled to perceptual variables such as time-to-contact. At a finer scale, within each act, the desired rate scaling can be realized only if precisely timed muscle activations first accelerate and then decelerate the limbs, to ensure that muscle length changes do not under- or over-shoot the amounts needed for precise acts. Each context of action may require a different timed muscle activation pattern than similar contexts. Because context differences that require different treatment cannot be known in advance, a formidable adaptive engine, the cerebellum, is needed to amplify differences within, and continuously search, a vast parallel signal flow, in order to discover contextual "leading indicators" of when to generate distinctive patterns of analog signals. From some parts of the cerebellum, such signals control muscles. But a recent model shows how the lateral cerebellum may serve the competitive queuing system (frontal cortex) as a repository of quickly accessed long-term sequence memories. Thus different parts of the cerebellum may use the same adaptive engine design to serve the lowest and highest of the three levels of temporal structure treated here. If so, no one-to-one mapping exists between levels of temporal structure and major parts of the brain. Finally, recent data cast doubt on network-delay models of cerebellar adaptive timing.
Abstract:
Introduction: The prevalence of diabetes is rising rapidly, and assessing the quality of diabetes care is difficult. Lower extremity amputation (LEA) is recognised as a marker of the quality of diabetes care. The focus of this thesis was first to describe trends in LEA rates in people with and without diabetes in the Republic of Ireland (RoI) in recent years, and then to explore the determinants of LEA in people with diabetes. While clinical and socio-demographic determinants are well established, the role of service-related factors has been less well explored. Methods: Using hospital discharge data, trends in LEA rates in people with and without diabetes were described and compared with other countries. Background work included concordance studies exploring the reliability of hospital discharge data for recording LEA and diabetes, and estimation of diabetes prevalence in the RoI from a nationally representative study (SLAN 2007). To explore determinants, a systematic review and meta-analysis assessed the effect of contact with a podiatrist on the outcome of LEA in people with diabetes. Finally, a case-control study using hospital discharge data explored determinants of LEA in people with diabetes, with a particular focus on the timing of access to secondary healthcare services as a risk factor. Results: There was high agreement between hospital discharge data and medical records for LEA and diabetes, so hospital discharge data were deemed sufficiently reliable for use in this PhD thesis. A decrease in major diabetes-related LEA rates was observed in the RoI from 2005 to 2012. In 2012, the relative risk of a person with diabetes undergoing a major LEA was 6.2 times (95% CI 4.8-8.1) that of a person without diabetes (a sketch of this calculation follows this abstract). Based on the systematic review and meta-analysis, contact with a podiatrist did not significantly affect the relative risk (RR) of LEA in people with diabetes. The case-control study identified being single, documented chronic kidney disease (CKD) and documented hypertension as significant risk factors for LEA in people with diabetes, whereas documented retinopathy was protective. Within the seven-year window of the study, no association was detected between LEA in patients with diabetes and the timing of patient access to secondary healthcare for diabetes management. Discussion: Many countries have reported reduced major LEA rates in people with diabetes coinciding with improved organisation of healthcare systems. Reassuringly, these first national estimates for people with diabetes in the RoI from 2005 to 2012 demonstrated decreasing trends in major LEA rates, which may be attributable to changes in diabetes care as well as to secular trends in smoking, dyslipidaemia and hypertension. Consistent with international practice, LEA trend data in Ireland can be used to monitor quality of care. Quantifying the improvement precisely, though, is problematic without robust denominator data on the prevalence of diabetes; nevertheless, the reduction in major diabetes-related LEA rates suggests improved quality of diabetes care. Much controversy exists around the reliability of hospital discharge data in the RoI. This thesis includes the first multi-site study to explore the issue and found hospital discharge data reliable for reporting the procedure of LEA and the diagnosis of diabetes. The project did not detect protective effects of access to services, including podiatry and secondary healthcare, on LEA in people with diabetes.
A major limitation of the systematic review and meta-analysis was the design and quality of the included studies; the available data on the effect of contact with a podiatrist on LEA risk are too sparse to say anything definitive about the efficacy of podiatry. Limitations of the case-control study include the lack of a diabetes register in Ireland, restricted information from secondary healthcare and the lack of data from primary healthcare. Because of these issues, duration of disease could not be accounted for, which limits the conclusions that can be drawn from the results. The model of diabetes care in the RoI is currently being re-configured, with plans to introduce integrated care. In the future, trends in LEA rates should be monitored continuously to evaluate the effectiveness of changes to the healthcare system. Efforts are already under way to improve the availability of routine data from primary healthcare with the recent development of the Irish Primary Care Research Network (iPCRN). Linkage of primary and secondary healthcare records through a unique patient identifier should be the goal for the future.
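A minimal sketch of how a relative-risk estimate with a 95% confidence interval, such as the one reported in the abstract above, is computed from a 2x2 table; the counts below are invented for illustration and are not the thesis data:

    # Relative risk and normal-approximation 95% CI from event counts and
    # population denominators (exposure = diabetes, outcome = major LEA).
    import math

    def relative_risk(events_exposed, n_exposed, events_unexposed, n_unexposed):
        risk_exposed = events_exposed / n_exposed
        risk_unexposed = events_unexposed / n_unexposed
        rr = risk_exposed / risk_unexposed
        # Standard error of log(RR), then back-transform the CI limits.
        se_log_rr = math.sqrt(
            1 / events_exposed - 1 / n_exposed +
            1 / events_unexposed - 1 / n_unexposed
        )
        lower = math.exp(math.log(rr) - 1.96 * se_log_rr)
        upper = math.exp(math.log(rr) + 1.96 * se_log_rr)
        return rr, lower, upper

    # Hypothetical counts: 300 major LEAs among 150,000 people with diabetes
    # versus 1,400 among 4,350,000 people without diabetes.
    rr, lower, upper = relative_risk(300, 150_000, 1_400, 4_350_000)
    print(f"RR = {rr:.1f} (95% CI {lower:.1f}-{upper:.1f})")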