926 results for Static average-case analysis


Relevance: 100.00%

Abstract:

Operating in business-to-business markets requires an in-depth understanding of business networks. Actions and reactions made to compete in markets are fundamentally based on managers' subjective perceptions of the network. However, the amalgamation of these individual perceptions, termed network pictures, into a common company-level shared understanding of that network, known as network insight, is found to be a substantial challenge for companies. A company's capability to enhance common network insight is even argued to lead to competitive advantage. Companies whose value-creating logics require broad comprehension of, and collaboration in, networks, such as solution business, especially need to develop advanced network insight. According to the extant literature, dispersed pieces of atomized network pictures can be unified into a common network insight through a process of amalgamation that comprises barriers/drivers of multilateral exchange, manifold rationality, and recursive time. However, the extant body of literature appears to lack an understanding of the role of internal communication in the development of network insight. Nonetheless, the extant understanding of the amalgamation process indicates that internal communication plays a substantial role in the development of company-level network insight. The purpose of the present thesis is to enhance understanding of internal communication in the amalgamation of network pictures to develop network insight in the solution business setting, which was chosen to represent a business-to-business value-creating logic that emphasizes the capability to understand and utilize networks. Thus, in solution business the importance of succeeding in the amalgamation process is expected to be emphasized. The study combines qualitative and quantitative research by means of various analytical methods, including multiple case analysis, simulation, and social network analysis.
Approaching this nascent research topic from differing perspectives and with differing means provides broader insight into the phenomenon. The study provides empirical evidence from Finnish business-to-business companies that operate globally. The empirical data comprise interviews (n=28) with managers of three case companies. In addition, the data include a questionnaire (n=23) collected mainly for the purpose of social network analysis. The thesis also includes a simulation study, carried out by means of agent-based modeling. The findings of the thesis shed light on the role of internal communication in the amalgamation process, contributing to the emergent discussion of network insight and thus to industrial marketing research. In addition, the thesis increases understanding of internal communication in the change process to solution business, a supplier's internal communication in its matrix organization structure during a project sales process, key barriers and drivers that influence internal communication in project sales networks, perceived power within industrial project sales, and the revisioning of network pictures. According to the findings, internal communication plays a substantial role in the amalgamation process. First, it is suggested that internal communication is a base of multilateral exchange. Second, it is suggested that internal communication intensifies and maintains manifold rationality. Third, internal communication is needed to explicate the usually differing time perspectives of others, and thus it is suggested that internal communication has a role as the explicator of recursive time. Furthermore, the role of an efficient amalgamation process is found to be emphasized in solution business, as it requires a more advanced network insight for cross-functional collaboration.
Finally, the thesis offers several managerial implications for industrial suppliers to enhance the amalgamation process when operating in solution business.

Relevance: 100.00%

Abstract:

Finnish design has attracted global attention lately, and companies within the industry have potential in international markets. Because networks have been found to be extremely helpful in a firm's international business operations, and because their usefulness is not fully exploited, their role in Finnish design companies is investigated. Accordingly, this study concentrates on understanding the role of networks in the internationalization process of Finnish design companies. This was investigated by describing the internationalization process of Finnish design companies, analyzing what kinds of networks are related to that process, and analyzing how networks are utilized in it. The theoretical framework explores the Finnish design industry, the internationalization process, and networks. The Finnish design industry is introduced in general, and the concept of design is defined to refer to the textile, furniture, clothing, and lighting equipment industries in this research. The theories of the internationalization process, the Uppsala model and Luostarinen's operation modes, are explored in detail. The Born Global theory, a contrary view to stage models, is also discussed. The concept of a network is investigated, networks are classified into business and social networks, and the network approach to internationalization is discussed. The research is conducted empirically, and the research method is a descriptive case study. In this study, four case companies are investigated: the interior decoration unit of L-Fashion Group, Globe Hope, Klo Design, and Melaja Ltd. Data were collected by semi-structured interviews, and the analysis proceeds as follows: the case companies are introduced, their internationalization processes and networks are described, and, finally, the case companies are compared in the form of a cross-case analysis.
This research showed that cooperation with social networks, such as locals or employees who have experience of the target market, can be extremely helpful at the beginning of a Finnish design company's internationalization process. The study also indicated that public organizations do not necessarily enhance the internationalization process from a design company's point of view. In addition, the research showed that there is cooperation between small Finnish design companies, whereas large design companies are not as open to cooperation with competitors.

Relevance: 100.00%

Abstract:

The object of this study was to examine the foreign operation mode strategies used by Finnish companies in Russia. Thus, it was necessary to understand how Finnish companies have used foreign operation modes and which factors have influenced their foreign operation mode strategies in Russia. Moreover, the purpose was also to find out whether Finnish companies have switched, stretched, or combined their foreign operation modes. The study's empirical part was conducted as a semi-structured qualitative within-case and cross-case analysis of seven case companies selected to represent different industries: five Finnish LSEs and two Finnish SMEs. The results of this study indicated that Finnish companies have mainly used exporting as their initial entry mode to the Russian market. After they had gained understanding and experience of the Russian market, they switched from non-equity and simple foreign operation modes to more challenging and equity-demanding foreign operation modes, and established wholly owned operations.

Relevance: 100.00%

Abstract:

The purpose of this research was to define content marketing and to discover how content marketing performance can be measured, especially on YouTube. Further, the aim was to find out what companies do to measure content marketing and what kinds of challenges they face in the process. In addition, preferences concerning measurement were examined. The empirical part was conducted through multiple-case study and cross-case analysis. The qualitative data were collected from four large companies in the Finnish food and drink industry through semi-structured phone interviews. As a result of this research, a new definition of content marketing was derived. It is suggested that return on objective, in this case brand awareness and engagement, be used as the main metric of content marketing performance on YouTube. The major challenge is the nature of the industry, as companies cannot connect the outcome directly to sales.

Relevance: 100.00%

Abstract:

China's phenomenal economic growth and social development have brought interesting opportunities for Finnish companies. One intriguing sector offering significant growth potential is the food industry. Due to local food safety issues, rising disposable income levels, and changing consumer habits, the demand for foreign food is increasing. Finnish food companies have much to offer in terms of high quality, food safety in production, technological development, and innovation. The purpose of this study is to examine how Finnish food enterprises choose their entry modes in the Chinese market. This study increases understanding of the entry modes Finnish companies can use to successfully enter the unpredictable Chinese market in the food industry context. The study examines industry-specific challenges and possible solutions to them. Qualitative research was selected as the methodology for this study because the intention is to understand the reasons behind Finnish food enterprises' entry mode choices in the Chinese market. The study is conducted as a qualitative case analysis. Six Finnish case companies operating in the food industry were interviewed. The results of the research indicate that most of the food industry companies use exporting as their entry mode to China; only one case company used an investment mode. This study illustrates the significance of factors related to a company's background, mode concerns, and Chinese market influences in the entry mode choice.

Relevance: 100.00%

Abstract:

Traditionally, metacognition has been theorised, methodologically studied, and empirically tested mainly from the standpoint of individuals and their learning contexts. In this dissertation the emergence of metacognition is analysed more broadly. The aim of the dissertation was to explore socially shared metacognitive regulation (SSMR) as part of collaborative learning processes taking place in student dyads and small learning groups. The specific aims were to extend the concept of individual metacognition to SSMR, to develop methods to capture and analyse SSMR, and to validate the usefulness of the concept of SSMR in two different learning contexts: in face-to-face student dyads solving mathematical word problems, and in small groups taking part in inquiry-based science learning in an asynchronous computer-supported collaborative learning (CSCL) environment. This dissertation comprises four studies. In Study I, the main aim was to explore if and how metacognition emerges during problem solving in student dyads, and then to develop a method for analysing the social level of the awareness, monitoring, and regulatory processes emerging during problem solving. Two dyads of 10-year-old students, high-achieving especially in mathematical word problem solving and reading comprehension, were involved in the study. An in-depth case analysis was conducted. Data consisted of over 16 videotaped and transcribed face-to-face sessions (30–45 minutes each). The dyads solved altogether 151 mathematical word problems of different difficulty levels in a game-format learning environment. An interaction flowchart was used in the analysis to uncover socially shared metacognition. Interviews (including stimulated-recall interviews) were conducted in order to obtain further information about socially shared metacognition. The findings showed the emergence of metacognition in a collaborative learning context in a way that cannot be explained solely by individual conceptions.
The concept of socially shared metacognitive regulation (SSMR) was proposed. The results highlighted the emergence of socially shared metacognition specifically in problems where dyads encountered challenges. Small verbal and nonverbal signals between students also triggered the emergence of socially shared metacognition. Additionally, one dyad implemented a system whereby they shared metacognitive regulation based on their strengths in learning. Overall, the findings suggested that in order to discover patterns of socially shared metacognition, it is important to investigate metacognition over time. However, it was concluded that more research on socially shared metacognition, from larger data sets, is needed. These findings formed the basis of the second study. In Study II, the specific aim was to investigate whether socially shared metacognition can be reliably identified from a large data set of collaborative face-to-face mathematical word problem solving sessions by student dyads. We specifically examined different difficulty levels of tasks as well as the function and focus of socially shared metacognition. Furthermore, the presence of observable metacognitive experiences at the beginning of socially shared metacognition was explored. Four dyads participated in the study. Each dyad comprised high-achieving 10-year-old students, ranked in the top 11% of their fourth-grade peers (n=393). The dyads were from the same data set as in Study I. They worked face-to-face in a computer-supported, game-format learning environment. Problem-solving processes for 251 tasks at three difficulty levels, taking place during 56 lessons (30–45 minutes each), were videotaped and analysed. Baseline data for this study were 14 675 turns of transcribed verbal and nonverbal behaviours observed in the four study dyads. The micro-level analysis illustrated how participants moved between different channels of communication (individual and interpersonal).
The unit of analysis was a set of turns, referred to as an 'episode'. The results indicated that socially shared metacognition, its function and focus, as well as the appearance of metacognitive experiences, can be defined in a reliable way from a larger data set by independent coders. A comparison of the different difficulty levels of the problems suggested that in order to trigger socially shared metacognition in small groups, the problems should be difficult, as opposed to moderately difficult or easy. Although socially shared metacognition was found in collaborative face-to-face problem solving among high-achieving student dyads, more research is needed in different contexts. This consideration created the basis for the research on socially shared metacognition in Studies III and IV. In Study III, the aim was to expand the research on SSMR from face-to-face mathematical problem solving in student dyads to inquiry-based science learning among small groups in an asynchronous computer-supported collaborative learning (CSCL) environment. The specific aims were to investigate SSMR's evolvement and functions in a CSCL environment and to explore how SSMR emerges at different phases of the inquiry process. Finally, individual student participation in SSMR during the process was studied. An in-depth explanatory case study of one small group of four girls aged 12 years was carried out. The girls attended a class that has an entrance examination and follows a language-enriched curriculum. The small group solved complex science problems in an asynchronous CSCL environment, participating in research-like processes of inquiry during 22 lessons (45 minutes each). Students' network discussions were recorded in written notes (N=640), which were used as study data. A set of notes, referred to here as a 'thread', was used as the unit of analysis. The inter-coder agreement was regarded as substantial.
The results indicated that SSMR emerges in a small group's asynchronous CSCL inquiry process in the science domain. Hence, the results of Study III were in line with Studies I and II and revealed that metacognition cannot be reduced to the individual level alone. The findings also confirm that SSMR should be examined as a process, since SSMR can evolve during different phases and different SSMR threads can overlap and intertwine. Although the classification of SSMR's functions was applicable in the context of CSCL in a small group, the dominant function in the asynchronous CSCL inquiry of a small group in a science activity differed from that in mathematical word problem solving among student dyads (Study II). Further, the use of different analytical methods provided complementary findings about students' participation in SSMR. The findings suggest that it is not enough to code just a single written note or simply to examine who has the largest number of notes in an SSMR thread; the connections between the notes must also be examined. As the findings of the present study are based on an in-depth analysis of a single small group, further cases were examined in Study IV, which also looked at SSMR's focus, as had been studied in the face-to-face context. In Study IV, the general aim was to investigate the emergence of SSMR with a larger data set from an asynchronous CSCL inquiry process in small student groups carrying out science activities. The specific aims were to study the emergence of SSMR in the different phases of the process, students' participation in SSMR, and the relation of SSMR's focus to the quality of outcomes, which had not been explored in the previous studies. The participants were 12-year-old students from the same class as in Study III. Five small groups consisting of four students and one of five students (N=25) were involved in the study.
The small groups solved ill-defined science problems in an asynchronous CSCL environment, participating in research-like processes of inquiry over a total period of 22 hours. Written notes (N=4088) detailed the network discussions of the small groups, and these constituted the study data. With these notes, SSMR threads were explored. As in Study III, the thread was used as the unit of analysis. In total, 332 notes were classified as forming 41 SSMR threads. Inter-coder agreement was assessed by three coders in the different phases of the analysis and found to be reliable. Multiple methods of analysis were used. Results showed that SSMR emerged in all the asynchronous CSCL inquiry processes in the small groups. However, the findings did not reveal any significant trend in the emergence of SSMR over the course of the process. As a main trend, the number of notes included in SSMR threads differed significantly between phases of the process, and the small groups differed from each other. Although student participation was highly dispersed among the students, there were differences between students and between small groups. Furthermore, the findings indicated that the amount of SSMR during the process, or the participation structure, did not explain the differences in the quality of the groups' outcomes. Rather, when SSMR was focused on understanding and procedural matters, it was associated with high-quality learning outcomes; in turn, when SSMR was focused on incidental and procedural matters, it was associated with low-level learning outcomes. Hence, the findings imply that the focus of any emerging SSMR is crucial to the quality of the learning outcomes. Moreover, the findings encourage the use of multiple research methods for studying SSMR. In total, the four studies convincingly indicate that a phenomenon of socially shared metacognitive regulation also exists.
This means that it was possible to define the concept of SSMR theoretically, to investigate it methodologically, and to validate it empirically in two different learning contexts across dyads and small groups. In-depth micro-level case analysis in Studies I and III showed that it is possible to capture and analyse SSMR in detail during the collaborative process, while in Studies II and IV the analysis validated the emergence of SSMR in larger data sets. Hence, validation was tested both between two environments and within the same environments with further cases. As part of this dissertation, SSMR's detailed functions and foci were revealed. Moreover, the findings showed the important role of observable metacognitive experiences as the starting point of SSMRs. It was apparent that the problems dealt with by the groups should be rather difficult if SSMR is to be made clearly visible. Further, individual students' participation was found to differ between students and groups. The multiple research methods employed revealed supplementary findings regarding SSMR. Finally, when SSMR was focused on understanding and procedural matters, this was seen to lead to higher-quality learning outcomes. Socially shared metacognitive regulation should therefore be taken into consideration in students' collaborative learning at school, similarly to how an individual's metacognition is taken into account in individual learning.

Relevance: 100.00%

Abstract:

Bottom of the pyramid (BoP) markets are an underserved market of approximately four billion people living on under $5 a day in four regional areas: Africa, Asia, Eastern Europe, and Latin America. According to estimates, the BoP market forms a $5 trillion global consumer market. Despite the potential of BoP markets, companies have traditionally focused on serving the markets of developed countries and ignored the large customer group at the bottom of the pyramid. The BoP approach, first developed by Prahalad and Hart in 2002, has focused on multinational corporations (MNCs), which were thought of as the ones who should take responsibility for serving the customers at the bottom of the pyramid. This study challenges this proposition and gives evidence that smaller international new ventures (INVs), entrepreneurial firms that are international from their birth, can also be successful in BoP markets. BoP markets are characterized by a number of deficiencies in the institutional environment, such as strong reliance on the informal sector, lack of infrastructure, and lack of skilled labor. The purpose of this study is to increase the understanding of international entrepreneurship in BoP markets by analyzing how international new ventures overcome institutional constraints in BoP markets and how institutional uncertainty can be exploited by solving institutional problems. The main objective is divided into four sub-objectives:

• To describe the opportunities and challenges BoP markets present
• To analyze the internationalization of INVs to BoP markets
• To examine what kinds of strategies international entrepreneurs use to overcome institutional constraints
• To explore the opportunities institutional uncertainty offers for INVs

A qualitative approach was used to conduct this study, and a multiple-case study was chosen as the research strategy in order to allow cross-case analysis.
The empirical data were collected through four interviews with the companies Fuzu, Mifuko, Palmroth Consulting and Sibesonke. The results indicated that understanding of the wider institutional environment improves the survival prospects of INVs in BoP markets, and that it is indeed possible to exploit institutional uncertainty by solving institutional problems. The main findings were that first-hand experience of the markets and grassroots-level information are the best assets in internationalization to BoP markets. This study highlights that international entrepreneurs with limited resources can improve the lives of people at the BoP through their business operations and act as small-scale institutional entrepreneurs contributing to the development of the institutional environment of BoP markets.

Relevance: 100.00%

Abstract:

Monte Carlo simulations were carried out using a nearest-neighbour ferromagnetic XY model on both 2-D and 3-D quasi-periodic lattices. In the 2-D case, both the unfrustrated and frustrated XY model were studied. For the unfrustrated 2-D XY model, we examined the magnetization, specific heat, linear susceptibility, helicity modulus, and the derivative of the helicity modulus with respect to inverse temperature. The behaviour of all these quantities points to a Kosterlitz-Thouless transition occurring in the temperature range T_c = (1.0–1.05) J/k_B, with critical exponents that are consistent with previous results (obtained for crystalline lattices). In the frustrated case, however, analysis of the spin-glass susceptibility and the Edwards-Anderson order parameter, in addition to the magnetization, specific heat, and linear susceptibility, supports a spin-glass transition. In the case where the 'thin' rhombus is fully frustrated, a freezing transition occurs at T_f = 0.137 J/k_B, which contradicts previous work suggesting the critical dimension of spin glasses to be d_c > 2. In the 3-D systems, examination of the magnetization, specific heat, and linear susceptibility reveals a conventional second-order phase transition. Through a cumulant analysis and finite-size scaling, a critical temperature of T_c = (2.292 ± 0.003) J/k_B and critical exponents of α = 0.03 ± 0.03, β = 0.30 ± 0.01, and γ = 1.31 ± 0.02 were obtained.
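As a concrete illustration of the simulation method, the Metropolis update for a ferromagnetic XY model can be sketched as follows. This is a minimal, hypothetical example on a small square lattice with periodic boundaries, not the quasi-periodic lattices of the study; the lattice size, temperature, and sweep count are arbitrary choices for illustration.

```python
import math
import random

L = 8          # linear lattice size (illustrative)
J = 1.0        # ferromagnetic coupling
T = 0.9        # temperature in units of J/k_B, below the KT estimate ~1.0

random.seed(0)
# Spin angles theta in [0, 2*pi); energy E = -J * sum_<ij> cos(theta_i - theta_j)
theta = [[random.uniform(0.0, 2.0 * math.pi) for _ in range(L)] for _ in range(L)]

def local_energy(x, y, angle):
    """Interaction energy of site (x, y) with its four nearest neighbours."""
    e = 0.0
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        e -= J * math.cos(angle - theta[(x + dx) % L][(y + dy) % L])
    return e

def metropolis_sweep():
    """One sweep: propose a new angle per site, accept with min(1, exp(-dE/T))."""
    for _ in range(L * L):
        x, y = random.randrange(L), random.randrange(L)
        new = random.uniform(0.0, 2.0 * math.pi)
        dE = local_energy(x, y, new) - local_energy(x, y, theta[x][y])
        if dE <= 0 or random.random() < math.exp(-dE / T):
            theta[x][y] = new

def magnetization():
    """Magnitude of the average spin vector, |<(cos t, sin t)>|."""
    mx = sum(math.cos(t) for row in theta for t in row) / (L * L)
    my = sum(math.sin(t) for row in theta for t in row) / (L * L)
    return math.hypot(mx, my)

for _ in range(200):
    metropolis_sweep()
print(round(magnetization(), 3))
```

In an actual study, observables such as the magnetization above, the specific heat, and the helicity modulus would be averaged over many such measurements for several lattice sizes before finite-size scaling.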

Relevance: 100.00%

Abstract:

The legitimacy of an organization is founded on its mission, that is, on its reason for being. Library administrators and many researchers fear that the legitimacy of public libraries will be contested in the information society. Moreover, the official texts presenting the missions of public libraries are diverse, and the missions are deliberately left undefined in them. In Quebec, where a large majority of autonomous public libraries are placed directly under the authority of the municipalities, public libraries must define and legitimize their missions with municipal elected officials. The main objective of this research is to understand, through their discourse, the point of view of Quebec municipal elected officials on the missions of autonomous public libraries, in comparison with the practices and resources of the libraries at the local level. Based on the theory of the social construction of reality, a conceptual framework is proposed in order to study not only the discourse in its textual dimension, but also to contextualize this discourse and analyze the gap between the discourse and library practices. The research strategy adopted is a multiple case study. The objective is to develop an in-depth analysis of each case and a cross-case analysis. The twelve cases (municipalities) were selected according to two variation criteria (the size of the municipality and the annual budget allocated by the municipality to the library) and one discriminating criterion (distance from the Université de Montréal). Interviews were conducted with the elected officials chairing the commission or committee responsible for public libraries. These interviews and the cultural policies were subjected to discourse analysis. The interviews with library administrators and the documentation were subjected to content analysis.

These analyses allowed the triangulation of methods and data sources. Quebec municipal elected officials, like professionals, do not offer a homogeneous discourse on the missions of public libraries. Nevertheless, a model of discourse emerges. It shows a discourse that is 'limited' relative to the literature, in which a passive image of the library is presented and in which tradition persists despite the context of the information society. But the analysis also reveals that elected officials construct their points of view on the basis of their own convictions as individuals, their role in the management of the municipality as elected officials, and the image they have of public library users. Finally, the analysis revealed an axis of differentiation of points of view according to whether the discourse is based on fundamental values or on the (real or supposed) uses of the library.

Relevance: 100.00%

Abstract:

Missing data are frequent in surveys and can lead to major errors in parameter estimation. This methodological master's thesis in sociology examines the influence of missing data on the estimation of the effect of a prevention program. The first two sections lay out the possible biases caused by missing data and present the theoretical approaches used to describe them. The third section covers methods for handling missing data; the classical methods are described, along with three recent ones. The fourth section presents the Enquête longitudinale et expérimentale de Montréal (ELEM) and describes the data used. The fifth presents the analyses carried out; it contains the method for analyzing the effect of an intervention from longitudinal data, an in-depth description of the missing data in the ELEM, and a diagnosis of the missingness patterns and mechanism. The sixth section contains the results of estimating the program's effect under different assumptions about the missing-data mechanism and using four methods: complete-case analysis, maximum likelihood, weighting, and multiple imputation. They indicate (I) that the assumption of a MAR missingness mechanism seems to influence the estimate of the program's effect, and (II) that the estimates obtained by the different estimation methods lead to similar conclusions about the effect of the intervention.
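The contrast between complete-case analysis and model-based corrections can be illustrated with a small simulation. This is a hedged sketch with invented data, not the ELEM data: the outcome is missing at random (MAR) given an observed covariate, which biases the complete-case mean, while a simple regression-based imputation (a crude stand-in for methods such as maximum likelihood or multiple imputation) recovers an estimate near the true value.

```python
import random

random.seed(42)
n = 20000
data = []
for _ in range(n):
    x = random.gauss(0.0, 1.0)            # fully observed covariate
    y = 2.0 * x + random.gauss(0.0, 1.0)  # outcome, true population mean 0
    # MAR: probability of missingness depends only on the observed x
    missing = random.random() < (0.8 if x < 0 else 0.1)
    data.append((x, None if missing else y))

# Complete-case estimate: low-x cases are under-represented, so it is biased up
complete = [y for _, y in data if y is not None]
cc_mean = sum(complete) / len(complete)

# Regression through the origin fitted on complete cases; under MAR given x
# this slope is consistent, so imputing beta*x for missing y removes the bias
xs = [x for x, y in data if y is not None]
beta = sum(x * y for x, y in data if y is not None) / sum(x * x for x in xs)
imputed = [y if y is not None else beta * x for x, y in data]
imp_mean = sum(imputed) / len(imputed)

print(round(cc_mean, 2), round(imp_mean, 2))
```

With this setup the complete-case mean lands around 1 instead of the true 0, while the imputation-based mean is close to 0; single deterministic imputation understates the variance, which is why multiple imputation draws several completed data sets instead.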

Relevance: 100.00%

Abstract:

Canada devotes billions each year to international aid. According to the Department of Foreign Affairs, Trade and Development, the aid deployed in 2013 amounted to more than 5.48 billion dollars. In every project implemented in developing countries, human resources give their time and strive to contribute to strengthening the capacities of local organizations. These projects are technical cooperation initiatives or contain technical cooperation components; the people assigned to them must perform multiple tasks, including that of knowledge-sharing agent. This thesis explores this phenomenon by shedding light on the relational processes underlying the exchanges between the people involved in these initiatives, namely the expatriate volunteer advisers and the members of the local teams hosting such initiatives. It tends to support the marked influence of interpersonal relationships on knowledge-sharing outcomes, except that trust alone is not sufficient to achieve sustainable development objectives. The case analysis, based mainly on semi-structured interviews conducted in Haiti and Senegal, allows us to affirm the importance of attending to the dynamic absorptive capacity of the parties to the sharing, but also to the roles of the managers of local partner organizations in their commitment to carrying out mandates aimed at knowledge sharing.

Relevance: 100.00%

Abstract:

Embedded systems are usually designed for a single or a specified set of tasks. This specificity means the system design, as well as its hardware/software development, can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints, and rapid development. This necessitates the adoption of static machine-code analysis tools running on a host machine for the validation and optimization of embedded system code, which can help meet all of these goals. This could significantly augment software quality and is still a challenging field.

This dissertation contributes an architecture-oriented code validation, error localization, and optimization technique that assists the embedded system designer in software debugging, making it more effective at early detection of software bugs that are otherwise hard to detect, using static analysis of machine code. The focus of this work is to develop methods that automatically localize faults as well as optimize the code, and thus improve both the debugging process and the quality of the code. Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing the computational and integrated peripheral functions. The stipulated rules are encoded in propositional logic formulae, and their compliance is tested individually in all possible execution paths of the application programs. An incorrect sequence of machine-code patterns is identified using slicing techniques on the control-flow graph generated from the machine code. An algorithm is proposed to assist the compiler in eliminating redundant bank-switching code and deciding on the optimal allocation of data to banked memory, resulting in the minimum number of bank-switching instructions in embedded system software.
A relation matrix and a state transition diagram, formed for the active-memory-bank state transitions corresponding to each bank selection instruction, are used for the detection of redundant code. Instances of code redundancy, based on the stipulated rules for the target processor, are identified. This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of compiler and assembler, and applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly by machine-code patterns, which drastically reduces state-space creation and contributes to improved state-of-the-art model checking. Though the technique described is general, the implementation is architecture oriented; hence the feasibility study was conducted on PIC16F87X microcontrollers. The proposed tool will be very useful in steering novices toward correct use of difficult microcontroller features when developing embedded systems.
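The rule-checking idea above, i.e. testing a stipulated rule on every execution path of a control-flow graph built from machine code, can be sketched roughly as follows. This is an illustrative toy, not the dissertation's tool: the instruction mnemonics, the single ordering rule, and the tiny acyclic CFG are all assumptions made for the example.

```python
# Toy sketch: check one stipulated ordering rule on all paths of a small,
# acyclic control-flow graph. Instruction names and the rule are illustrative.

# Rule (propositional form): on every path, a "READ_ADC" instruction must be
# preceded by an "INIT_ADC" -- an out-of-place READ_ADC is flagged as a fault.

def violates_rule(path):
    """Return True if READ_ADC appears before any INIT_ADC on the path."""
    initialized = False
    for ins in path:
        if ins == "INIT_ADC":
            initialized = True
        elif ins == "READ_ADC" and not initialized:
            return True
    return False

def all_paths(cfg, node, path=()):
    """Enumerate the instruction sequence along every path of an acyclic CFG."""
    path = path + tuple(cfg[node]["code"])
    succs = cfg[node]["succ"]
    if not succs:
        yield path
    else:
        for s in succs:
            yield from all_paths(cfg, s, path)

# Tiny CFG: the entry block branches into two blocks that rejoin at exit.
cfg = {
    "entry": {"code": ["MOVLW 0x07"], "succ": ["a", "b"]},
    "a":     {"code": ["INIT_ADC", "READ_ADC"], "succ": ["exit"]},
    "b":     {"code": ["READ_ADC"], "succ": ["exit"]},   # faulty path
    "exit":  {"code": [], "succ": []},
}

bad = [p for p in all_paths(cfg, "entry") if violates_rule(p)]
print(len(bad))  # -> 1: exactly one faulty execution path is detected
```

A real tool would of course derive the CFG from disassembled machine code and encode many such rules per peripheral; the point here is only that path enumeration plus a per-path predicate localizes the offending sequence.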

Relevância:

100.00% 100.00%

Publicador:

Resumo:

Bank switching in embedded processors with a partitioned memory architecture results in both code-size and run-time overhead. This work presents an algorithm, and its application, to assist the compiler in eliminating the redundant bank-switching code introduced and in deciding the optimum data allocation to banked memory. A relation matrix, formed for the memory-bank state transition corresponding to each bank selection instruction, is used for the detection of redundant code. Data allocation to memory is done by considering all possible permutations of memory banks and combinations of data. The compiler output corresponding to each data-mapping scheme is subjected to a static machine-code analysis, which identifies the one with the minimum number of bank-switching instructions. Even though the method is compiler independent, the algorithm utilizes certain architectural features of the target processor. A prototype based on PIC16F87X microcontrollers is described. The method scales well to larger numbers of memory banks and to other architectures, so that high-performance compilers can integrate this technique for efficient code generation. The technique is illustrated with an example.

Relevância:

100.00% 100.00%

Publicador:

Resumo:

This paper presents the impact of integrating interventions such as nutrition gardening, livestock rearing, product diversification and allied income-generation activities in small and marginal coconut homesteads, along with nutrition education, in improving the food and nutritional security as well as the income of the family members. The activities were carried out through registered Community Based Organizations (CBOs) in three locations in Kerala, India during 2005-2008. Data were collected before and after the project period through interviews using a pre-tested questionnaire containing statements indicating the adequacy, quality and diversity of food materials. Fifty respondents each were randomly selected from the three communities, resulting in a total sample size of 150. The data were analysed using SPSS with statistical tools including frequency, average, percentage analysis, t-test and regression. Participatory planning and implementation of diverse interventions, notably intercropping and off-farm activities, along with nutrition education brought about significant improvements in food and nutritional security, in terms of frequency and quantity of consumption as well as diet diversity. At the end of the project, 96% of the members had become completely food secure and 72% nutritionally secure. The overall consumption of fruits, vegetables and milk by both children and adults, and of eggs by children, increased over the project period. Consumption of fish exceeded the Recommended Dietary Intake (RDI) level during both the pre- and post-project periods. Project interventions such as nutrition gardening brought consumption of vegetables (35%) and fruits (10%) above the RDI. In spite of the increased consumption of green leafy vegetables and milk and milk products over the project period, the levels of consumption were still below the RDI levels. CBO-wise analysis of the consumption patterns revealed the need for location-specific interventions matching the needs and preferences of the communities.

Relevância:

100.00% 100.00%

Publicador:

Resumo:

This thesis presents three important results in visual object recognition based on shape. (1) A new algorithm (RAST: Recognition by Adaptive Subdivisions of Transformation space) is presented that has lower average-case complexity than any known recognition algorithm. (2) It is shown, both theoretically and empirically, that representing 3D objects as collections of 2D views (the "View-Based Approximation") is feasible and affects the reliability of 3D recognition systems no more than other commonly made approximations. (3) The problem of recognition in cluttered scenes is considered from a Bayesian perspective; the commonly used "bounded-error error measure" is demonstrated to correspond to an independence assumption. It is shown that by better modeling the statistical properties of real scenes, objects can be recognized more reliably.
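The core idea of adaptive subdivision of transformation space can be sketched as a branch-and-bound search: a cell of transformation space is discarded when an upper bound shows it cannot beat the best match found so far, and is subdivided otherwise. The sketch below is a heavily simplified 1D-translation toy in the spirit of RAST, not the thesis's algorithm; the model points, image points, and tolerance are invented for the example.

```python
# Toy branch-and-bound over 1D translations with bounded-error matching.
MODEL = [0.0, 1.0, 2.5]            # model point positions (assumed)
IMAGE = [3.0, 4.0, 5.5, 9.0]       # observed image points (assumed)
EPS = 0.05                          # bounded-error matching tolerance

def upper_bound(lo, hi):
    """Model points that *could* match for some translation t in [lo, hi]."""
    return sum(
        any(lo + m - EPS <= p <= hi + m + EPS for p in IMAGE)
        for m in MODEL
    )

def score(t):
    """Model points that actually match under translation t."""
    return sum(any(abs(m + t - p) <= EPS for p in IMAGE) for m in MODEL)

def rast_search(lo, hi, depth=0, best=(-1, None)):
    """Subdivide [lo, hi], pruning cells whose bound cannot beat the best."""
    if upper_bound(lo, hi) <= best[0]:
        return best                 # prune: this cell cannot improve the match
    mid = (lo + hi) / 2
    s = score(mid)
    if s > best[0]:
        best = (s, mid)
    if depth < 20:                  # stop subdividing once cells are tiny
        for a, b in ((lo, mid), (mid, hi)):
            best = rast_search(a, b, depth + 1, best)
    return best

matches, t = rast_search(-10.0, 10.0)
print(matches)  # -> 3: all model points match at a translation near t = 3.0
```

The pruning step is what gives this family of algorithms its good average-case behavior: large regions of transformation space with a low bound are rejected wholesale, so fine subdivision happens only near promising transformations.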