884 results for Theories and models
Abstract:
The research described here concerns the development of metrics and models to support the development of hybrid (conventional/knowledge-based) integrated systems. The thesis starts from the position that, although it is well known that estimating the cost, duration and quality of information systems is a difficult task, it is far from clear what sorts of tools and techniques would adequately support a project manager in estimating these properties. A literature review shows that metrics (measurements) and estimating tools have been developed for conventional systems since the 1960s, while there has been very little research on metrics for knowledge-based systems (KBSs). Furthermore, although there are a number of theoretical problems with many of the 'classic' metrics developed for conventional systems, it also appears that the tools which such metrics can be used to develop are not widely used by project managers. A survey of large UK companies confirmed this continuing state of affairs. Before any useful tools could be developed, therefore, it was important to find out why project managers were not already using the existing ones. By characterising those companies that use software cost estimating (SCE) tools against those which could but do not, it was possible to recognise the involvement of the client/customer in the process of estimation. Pursuing this point, a model of the early estimating and planning stages (the EEPS model) was developed to test exactly where estimating takes place. The EEPS model suggests that estimating could take place either before a fully developed plan has been produced, or while the plan is being produced. If it were the former, then SCE tools would be particularly useful, since very little other data is available from which to produce an estimate.
A second survey, however, indicated that project managers see estimating as essentially the latter, at which point project management tools are available to support the process. It would seem, therefore, that SCE tools are not being used because project management tools are being used instead. The issue is not with the method of developing an estimating model or tool, but with the way in which "an estimate" is intimately tied to an understanding of what tasks are being planned. Current SCE tools are perceived by project managers as targeting the wrong point of estimation. A model (called TABATHA) is then presented which describes how an estimating tool based on an analysis of tasks would fit into the planning stage. The issue of whether metrics can be usefully developed for hybrid systems (which also contain KBS components) is tested by extending a number of 'classic' program size and structure metrics to a KBS language, Prolog. Measurements of lines of code, Halstead's operators/operands, McCabe's cyclomatic complexity, Henry & Kafura's data flow fan-in/out, and post-release reported errors were taken for a set of 80 commercially developed LPA Prolog programs. By redefining the metric counts for Prolog, it was found that estimates of program size and error-proneness comparable to the best conventional studies are possible. This suggests that metrics can be usefully applied to KBS languages such as Prolog, and thus that the development of metrics and models to support the development of hybrid information systems is both feasible and useful.
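The 'classic' metrics the thesis extends are mechanical counts over source code. As a minimal illustration, in Python rather than the thesis's Prolog and with a simplified counting rule rather than the thesis's redefined counts, McCabe's cyclomatic complexity can be computed as decision points plus one:

```python
import ast

def cyclomatic_complexity(source: str) -> int:
    """McCabe's M = D + 1: count decision points in `source`, add 1."""
    tree = ast.parse(source)
    decisions = 0
    for node in ast.walk(tree):
        # Each branch, loop, or exception handler adds one independent path.
        if isinstance(node, (ast.If, ast.For, ast.While, ast.ExceptHandler)):
            decisions += 1
        elif isinstance(node, ast.BoolOp):
            # 'a and b and c' adds len(values) - 1 short-circuit branches.
            decisions += len(node.values) - 1
    return decisions + 1

snippet = """
def classify(x):
    if x < 0:
        return "negative"
    for _ in range(3):
        if x > 10 and x % 2 == 0:
            return "big even"
    return "other"
"""
print(cyclomatic_complexity(snippet))  # 2 ifs + 1 for + 1 'and' + 1 = 5
```

The thesis's contribution was precisely that such counts, originally defined for imperative code, can be meaningfully redefined for Prolog clauses.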
Abstract:
As a discipline, supply chain management (SCM) has traditionally been concerned primarily with the procurement, processing, movement and sale of physical goods. However, an important class of products has emerged, digital products, which cannot be described as physical because they do not obey commonly understood physical laws: they do not possess mass or volume, and they require no energy in their manufacture or distribution. With the Internet, they can be distributed at speeds unimaginable in the physical world, and every copy produced is a 100% perfect duplicate of the original. Furthermore, the ease with which digital products can be replicated has few analogues in the physical world. This paper assesses the effect of non-physicality on one such product, software, in relation to the practice of SCM. It explores the challenges that arise when managing the software supply chain and how practitioners are addressing them. Using a two-pronged exploratory approach that examines the literature on software management together with direct interviews with software distribution practitioners, a number of key challenges associated with software supply chains are uncovered, along with responses to these challenges. The paper proposes a new model for software supply chains that takes into account the non-physicality of the product being delivered. Central to this model are the replacement of physical flows with flows of intellectual property, the growing importance of innovation over duplication, and the increased centrality of the customer in the entire process. Hybrid physical/digital supply chains are discussed and a framework for practitioners concerned with software supply chains is presented.
Abstract:
This paper critically evaluates the paradigm, theory, and methodology that dominate research on related party transactions (RPTs). The literature has debated whether RPTs are a facet of the conflict of interest between majority and minority shareholders, or whether they are normal, efficient transactions that help firms achieve better asset utilisation. The literature has also shown wide interest in the association between corporate governance and RPTs, particularly because, under agency theory, corporate governance is assumed to act as a monitoring tool that should impede the negative consequences of RPTs and ensure they are conducted to achieve better asset utilisation.
Abstract:
AMS Subj. Classification: 47J10, 47H30, 47H10
Abstract:
This study analyses the consumption tendencies of the post-modern age and the particular development of post-modern marketing, primarily through the example of tourism. Drawing on the Hungarian and international literature as well as their own research and observations, the authors confront accepted principles and theories with practice, and draw attention to the problems of adapting marketing activity in Hungary. An extremely interesting study, "A posztmodern marketing rózsaszirmai" ("The Rose Petals of Post-modern Marketing") by Ariel Zoltán Mitev and Dóra Horváth, appeared in issue 2008/9 of the journal Vezetéstudomány. That forward-looking, engaging, and in every respect constructive and novel article strongly influenced the authors of the present study, largely because of the virtues just listed, though in some respects it invited supplementation; it inspired the further contributions attempted here. The present article is also an organic continuation of the authors' earlier joint publications, most notably their recent article in the journal Marketing & Menedzsment.
Abstract:
This study investigated the treatment theories and procedures for postural control training used by Occupational Therapists (OTs) when working with hemiplegic adults who have had a cerebrovascular accident (CVA) or traumatic brain injury (TBI). Data were collected through a national survey of 400 randomly selected physical disability OTs, with 127 usable surveys returned. Results showed that the most commonly used treatment theory was neurodevelopmental treatment (NDT), followed by the motor relearning program (MRP), proprioceptive neuromuscular facilitation (PNF), Brunnstrom's approach, and Rood's approach. The most common treatment posture was sitting, followed by standing, mat activity, equilibrium reaction training, and walking. The factors affecting the use of the various treatment theories and procedures were years certified, years of clinical experience, work situation, and work status. Pearson correlation coefficient analyses found significant positive relationships between treatment theories and postures, and significantly high correlations between usage of all pairs of treatment procedures.
Abstract:
We consider SU(3)-equivariant dimensional reduction of Yang-Mills theory over certain cyclic orbifolds of the 5-sphere which are Sasaki-Einstein manifolds. We obtain new quiver gauge theories extending those induced via reduction over the leaf spaces of the characteristic foliation of the Sasaki-Einstein structure, which are projective planes. We describe the Higgs branches of these quiver gauge theories as moduli spaces of spherically symmetric instantons, which are SU(3)-equivariant solutions to the Hermitian Yang-Mills equations on the associated Calabi-Yau cones, and further compare them to moduli spaces of translationally invariant instantons on the cones. We provide an explicit unified construction of these moduli spaces as Kähler quotients and show that they have the same cyclic orbifold singularities as the cones over the lens 5-spaces. (C) 2015 The Authors. Published by Elsevier B.V.
Abstract:
Visual recognition is a fundamental research topic in computer vision. This dissertation explores the datasets, features, learning methods, and models used for visual recognition. In order to train visual models and evaluate different recognition algorithms, this dissertation develops an approach to collecting object image datasets from web pages using an analysis of the text around each image and of the image's appearance. The method exploits established online knowledge resources (Wikipedia pages for text; the Flickr and Caltech datasets for images), which provide rich text and object appearance information. Results are reported on two datasets. The first is Berg's collection of 10 animal categories, on which the approach significantly outperforms previous work; on an additional set of 5 categories, experimental results likewise show the effectiveness of the method. Images are represented as features for visual recognition. This dissertation introduces a text-based image feature and demonstrates that it consistently improves performance on hard object classification problems. The feature is built using an auxiliary dataset of tag-annotated images downloaded from the Internet. Because image tags are noisy, the method obtains the text features of an unannotated image from the tags of its k-nearest neighbours in this auxiliary collection. A visual classifier presented with an object viewed under novel circumstances (say, a new viewing direction) can rely only on its visual examples, whereas the text feature may change little, because the auxiliary dataset likely contains a similar picture: while the tags associated with images are noisy, they are more stable than appearance when viewing conditions change. The performance of this feature is tested on the PASCAL VOC 2006 and 2007 datasets. The feature performs well; it consistently improves the performance of visual object classifiers, and is particularly effective when the training dataset is small.
As more and more training data is collected, computational cost becomes a bottleneck, especially when training sophisticated classifiers such as kernelized SVMs. This dissertation proposes a fast training algorithm called the Stochastic Intersection Kernel Machine (SIKMA). The proposed training method will be useful for many vision problems, as it can produce a kernel classifier that is more accurate than a linear classifier, and can be trained on tens of thousands of examples in two minutes. It processes training examples one by one in a sequence, so memory cost is no longer the bottleneck for large-scale datasets. This dissertation applies the approach to train classifiers for Flickr groups, each with many training examples. The resulting Flickr group prediction scores can be used to measure the similarity between two images. Experimental results on the Corel dataset and a PASCAL VOC dataset show that the learned Flickr features perform better on image matching, retrieval, and classification than conventional visual features. Visual models are usually trained to best separate positive and negative training examples. However, when recognizing a large number of object categories, there may not be enough training examples for most objects, due to the intrinsic long-tailed distribution of objects in the real world. This dissertation proposes an approach based on comparative object similarity. The key insight is that, given a set of object categories which are similar and a set of categories which are dissimilar, a good object model should respond more strongly to examples from similar categories than to examples from dissimilar categories. This dissertation develops a regularized kernel machine algorithm to use this category-dependent similarity regularization. Experiments on hundreds of categories show that the method makes significant improvements for categories with few or even no positive examples.
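SIKMA is built around the histogram intersection kernel. A minimal sketch of that kernel, with made-up feature vectors rather than the dissertation's data, is:

```python
# Histogram intersection kernel: K(x, z) = sum_i min(x_i, z_i)
# for nonnegative histogram features (e.g. normalised visual-word counts).
# The vectors below are illustrative only.

def intersection_kernel(x, z):
    """Histogram intersection between two nonnegative feature vectors."""
    return sum(min(a, b) for a, b in zip(x, z))

x = [0.25, 0.5, 0.25]   # a normalised histogram (sums to 1)
z = [0.5, 0.25, 0.25]

print(intersection_kernel(x, x))  # self-similarity of a unit histogram: 1.0
print(intersection_kernel(x, z))  # 0.25 + 0.25 + 0.25 = 0.75
```

Because this kernel is additive and piecewise-linear in each coordinate, classifiers built on it can be evaluated and updated far more cheaply than generic kernel SVMs, which is what makes the one-example-at-a-time stochastic training described above practical.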
Abstract:
Background: Evidence-based practice (EBP) is a process through which research is applied in daily clinical practice. Occupational therapists (OTs) and physiotherapists (PTs) are expected to work in line with EBP in order to optimise health care resources. This expectation is too seldom fulfilled. Consequently, research findings may not be implemented in clinical practice in a timely manner, or at all. To remedy this situation, additional knowledge is needed regarding what factors influence the process of EBP among practitioners. The purpose of the present study was to identify factors that influence the use of EBP and the experienced effects of the use of EBP among PTs and OTs in their clinical work. Method: This was a qualitative interview study that consisted of six group interviews involving either OTs or PTs employed by the Jönköping County Council in the South of Sweden. Resulting data were analysed using content analysis. Results: The analysis resulted in the following categories: “definition of evidence and EBP”, “sources of evidence”, “barriers to acquiring evidence and to using evidence in clinical work”, “factors that facilitate the acquisition of evidence and the use of evidence in clinical work”, and “personal experiences of using EBP”. Basing clinical practice on scientific evidence evoked positive experiences, although an ambivalent view towards acting on clinical experience was evident. Participants reported that time for and increased knowledge about searching for, evaluating, and implementing EBP were needed. Conclusion: Because OTs are more oriented towards professional theories and models, and PTs are more focused on randomised controlled trials of interventions, different strategies appear to be needed to increase EBP in these two professions. Management support was considered vital to the implementation of EBP. 
However, the personal obligation to work in line with EBP must also be emphasised; the participants apparently underestimate its importance.
Abstract:
Due to their intriguing dielectric, pyroelectric, elasto-electric, or opto-electric properties, oxide ferroelectrics are prime candidates for the fabrication of many electronic devices. However, these extraordinary properties exist mainly in the temperature regime around the ferroelectric phase transition, which is usually several hundred kelvin away from room temperature. Therefore, the manipulation of oxide ferroelectrics, especially moving the ferroelectric transition towards room temperature, is of great interest for applications as well as basic research. In this thesis, we demonstrate such manipulation using NaNbO3 films as an example. We show that the transition temperature of these films can be modified via the compressive strain caused by epitaxial film growth on a structurally mismatched substrate, and that this strain can be fixed by controlling the stoichiometry. The structural and electronic properties of Na1+xNbO3+δ thin films are carefully examined by, among other methods, XRD (e.g. RSM), TEM, and cryoelectronic measurements. The electronic features in particular are analyzed via specially developed interdigitated electrodes in combination with an integrated temperature sensor and heater. The electronic data are interpreted using existing as well as novel theories and models, and prove to be closely correlated with the structural characteristics. The major results are:
- Na1+xNbO3+δ thin films can be grown epitaxially on (110)NdGaO3 with a thickness of up to 140 nm (thicker films have not been studied). Plastic relaxation of the compressive strain sets in when the thickness of the film exceeds approximately 10 – 15 nm. Films with excess Na are mainly composed of NaNbO3 with a minor contribution of Na3NbO4. The latter phase seems to form nanoprecipitates that are homogeneously distributed in the NaNbO3 film, which helps to stabilize the film and reduce the relaxation of the strain.
- For the nominally stoichiometric films, the compressive strain leads to a broad and frequency-dispersive phase transition at lower temperature (125 – 147 K). This could be either a new transition or a shift in temperature of a known transition. Considering its broadness and frequency dispersion, it is best described as a transition from a dielectric state at high temperature to a relaxor-type ferroelectric state at low temperature, the latter based on the formation of polar nano-regions (PNRs). The electric field dependence of the freezing temperature allows a direct estimation of the volume (70 to 270 nm3) and diameter (5.2 to 8 nm, spherical approximation) of the PNRs. These values agree with literature values measured by other techniques.
- In the case of the off-stoichiometric samples, we again observe classical ferroelectric behavior. However, the thermally hysteretic phase transition, which is observed around 620 – 660 K for unstrained material, is shifted to room temperature by the compressive strain. Apart from this temperature shift, the temperature dependence of the permittivity is nearly identical for strained and unstrained materials.
- Last but not least, in all cases a significant anisotropy in the electronic and structural properties is observed, which arises naturally from the anisotropic strain caused by the orthorhombic structure of the substrate. This anisotropy cannot be explained by the classical model, which tries to fit an orthorhombic film onto an orthorhombic substrate. A novel "square lattice" model, in which the films adopt a "square"-shaped in-plane lattice during epitaxial growth at elevated temperature (~1000 K), nicely explains the experimental results.
In this thesis we sketch a way to manipulate the ferroelectricity of NaNbO3 films via strain and stoichiometry.
The results indicate that the compressive strain generated by the epitaxial growth of the film on a mismatched substrate is able to reduce the ferroelectric transition temperature or induce a phase transition at low temperature. Moreover, adding Na to the NaNbO3 film forms a secondary phase, Na3NbO4, which seems to stabilize the main NaNbO3 phase and the strain, and thus makes it possible to engineer the ferroelectric behavior from the expected classical ferroelectric for perfect stoichiometry, to relaxor-type ferroelectric for slight off-stoichiometry, and back to classical ferroelectric for larger off-stoichiometry. Both strain and stoichiometry thus prove to be effective means of tailoring the ferroelectric properties of oxide films.
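The PNR diameters quoted above follow from the measured volumes under the spherical approximation, d = (6V/π)^(1/3). A quick back-of-the-envelope check:

```python
import math

def sphere_diameter(volume_nm3: float) -> float:
    """Diameter of a sphere of the given volume: d = (6V / pi)^(1/3)."""
    return (6.0 * volume_nm3 / math.pi) ** (1.0 / 3.0)

for v in (70.0, 270.0):
    print(f"V = {v:5.0f} nm^3  ->  d = {sphere_diameter(v):.2f} nm")
# 70 nm^3 gives d ≈ 5.1 nm and 270 nm^3 gives d ≈ 8.0 nm, consistent
# with the 5.2 – 8 nm range reported in the thesis.
```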
Abstract:
This study focuses on the learning and teaching of Reading in English as a Foreign Language (REFL), in Libya. The study draws on an action research process in which I sought to look critically at students and teachers of English as a Foreign Language (EFL) in Libya as they learned and taught REFL in four Libyan research sites. The Libyan EFL educational system is influenced by two main factors: the method of teaching the Holy-Quran and the long-time ban on teaching EFL by the former Libyan regime under Muammar Gaddafi. Both of these factors have affected the learning and teaching of REFL and I outline these contextual factors in the first chapter of the thesis. This investigation, and the exploration of the challenges that Libyan university students encounter in their REFL, is supported by attention to reading models. These models helped to provide an analytical framework and starting point for understanding the many processes involved in reading for meaning and in reading to satisfy teacher instructions. The theoretical framework I adopted was based, mainly and initially, on top-down, bottom-up, interactive and compensatory interactive models. I drew on these models with a view to understanding whether and how the processes of reading described in the models could be applied to the reading of EFL students and whether these models could help me to better understand what was going on in REFL. The diagnosis stage of the study provided initial data collected from four Libyan research sites with research tools including video-recorded classroom observations, semi-structured interviews with teachers before and after lesson observation, and think-aloud protocols (TAPs) with 24 students (six from each university) in which I examined their REFL reading behaviours and strategies. 
This stage indicated that the majority of students shared behaviours such as reading aloud, reading each word in the text, articulating the phonemes and syllables of words, or skipping words if they could not pronounce them. Overall, this first stage indicated that alternative methods of teaching REFL were needed in order to encourage 'reading for meaning', based on strategies related to eventual interactive reading models adapted for REFL. The second phase of this research project was an intervention phase involving two team-teaching sessions in one of the four stage-one universities. In each session, I worked with the teacher of one group to introduce an alternative method of REFL, based on teaching different reading strategies to encourage the students to work towards an eventual interactive way of reading for meaning. A focus group discussion and TAPs with six students followed the lessons, in order to discuss the 'new' method. Next came two video-recorded classroom observations, followed by an audio-recorded discussion with the teacher about these methods. Finally, I conducted a Skype interview with the class teacher at the end of the semester to discuss any changes he had made in his teaching, or had observed in his students' reading, with respect to reading behaviours and strategies, and the reactions and performance of the students as he continued to use the 'new' method. The results of the intervention stage indicate that the teacher, perhaps not surprisingly, can play an important role in adding to students' knowledge and confidence and in improving their REFL strategies. For example, after the intervention stage, students began to think about the title and to use their own background knowledge to comprehend the text. The students also employed linguistic strategies such as decoding and, above all, abandoned the behaviour of reading for pronunciation in favour of reading for meaning.
Despite the apparent efficacy of the alternative method, there are, inevitably, limitations related to the small-scale nature of the study and the time available to conduct the research. There are challenges, too, related to the students' first language, the idiosyncrasies of the English language, teacher training and the continuing professional development of teachers, and the continuing political instability of Libya. The students' lack of vocabulary and their difficulties with grammatical forms such as phrasal and prepositional verbs, which do not exist in Arabic, mean that REFL will always be challenging. Given such constraints, the 'new' methods I trialled and propose for adoption can only go so far in addressing students' difficulties in REFL. Overall, the study indicates that the Libyan educational system is underdeveloped and under-resourced with respect to REFL. My data indicates that the teacher participants have received little to no professional development that could help them improve their REFL teaching or their EFL teaching skills. These circumstances, along with the perennial problem of large but varying class sizes; student, teacher and assessment expectations; and limited and often poor-quality resources, affect the way EFL students learn to read in English. Against this background, the thesis concludes by offering tentative conclusions, reflections on the study (including a discussion of its limitations), and recommendations designed to improve REFL learning and teaching in Libyan universities.