15 results for layers of formal neurons, separability principles

in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevance:

100.00%

Publisher:

Abstract:

The main objective of this thesis was to generate better filtration technologies for the effective production of pure starch products, and thereby to optimise filtration sequences using the created models and to synthesise the theories of the different filtration stages that are suitable for starches. First, the structure and characteristics of the different starch grades are introduced, and each starch grade is shown to have special characteristics. These form the basis for understanding the differences in the behaviour of the different native starch grades and their modifications in pressure filtration. Next, the pressure filtration process is divided into stages: filtration, cake washing, compression dewatering and displacement dewatering. Each stage is considered individually in its own chapter. The order of the suitable combinations of the process stages is studied, as well as the proper durations and pressures of the stages. The principles of the theory of each stage are reviewed, the methods for monitoring the progress of each stage are presented, and finally their modelling is introduced. The experimental results obtained from the different stages of the starch filtration tests are given, and the suitability of the theories and models for starch filtration is shown. Finally, the theories and models are gathered together, and it is shown that the analysis of the whole starch pressure filtration process can be performed with the software developed.
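As background for the filtration-stage modelling described above, the classical constant-pressure cake filtration relation can serve as a reference point. This is standard filtration theory, not an equation quoted from this thesis:

```latex
% Classical constant-pressure cake filtration (Ruth/Darcy form):
%   t   - filtration time            V      - cumulative filtrate volume
%   \mu - filtrate viscosity         \alpha - specific cake resistance
%   c   - solids deposited per unit filtrate volume
%   A   - filter area                \Delta p - applied pressure difference
%   R_m - filter medium resistance
\frac{\mathrm{d}t}{\mathrm{d}V}
  = \frac{\mu\,\alpha\,c}{A^{2}\,\Delta p}\,V
  + \frac{\mu\,R_{m}}{A\,\Delta p}
```

At constant \(\Delta p\) this integrates to \(t/V\) being linear in \(V\), which is the usual basis for fitting the cake and medium resistances from filtration test data of the kind reported in the thesis.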

Relevance:

100.00%

Publisher:

Abstract:

The most important knowledge in firms is mostly tacit and embedded in individuals within the organization. This background knowledge that firms possess is used for creation of new knowledge and innovations. As firms today greatly concentrate on their core competencies, they need external knowledge from various collaboration partners. Thus, collaborative relationship governance, as well as control (use of appropriability mechanisms) over background (the input from each firm in innovative activities) and foreground knowledge (the output of collaboration activities) is needed in order to successfully create and capture value from innovative activities without losing core knowledge and competitiveness. Even though research has concentrated on knowledge protection and knowledge sharing, studies that combine both of these views and examine the effects of sharing and protection on value creation and capture have been rather limited. Studies have mainly focused on the protection of the output of innovation while forgetting the protection of the input of innovation. On the other hand, as the research concentrating on the output of innovation tends to favor formal mechanisms, informal mechanisms have remained more unknown to researchers as well as managers. This research aims to combine the perspectives of knowledge sharing and knowledge protection and their relationship with value creation and value capture. The sharing and protection are viewed from two points of view: the use of appropriability mechanisms, as well as governance of the collaborative relationship. The study consists of two parts. The first part introduces the research topic and discusses the overall results. The second part comprises six complementary research publications. Both qualitative and quantitative research methods are used in the study. In terms of results, the findings enhance understanding of the combined use of formal and informal mechanisms for knowledge protection and sharing. 
Informal mechanisms appear to be emphasized in the protection of background knowledge, and thus are prerequisites for innovation, whereas formal mechanisms are relied on more for protecting the results of innovative activities. However, the simultaneous use of the formal and informal mechanisms that are relevant to the particular industry and innovation context is recommended throughout the collaborative innovation process. Further, the study adds to the current knowledge on HRM as an appropriability mechanism: at the firm level its uses include assessing and hedging against employee-related risks such as knowledge leaking and knowledge leaving. A further contribution is to the research on HRM protection and its interrelations with other appropriability mechanisms, its constituents, and its potential use in the area of knowledge protection.

Relevance:

100.00%

Publisher:

Abstract:

The thesis explores global and national-level issues related to the development of markets for biomass for energy. The thesis consists of five separate papers and provides insights on selected issues. The aim of Paper I was to identify methodological and statistical challenges in assessing international solid and liquid biofuels trade and provide an overview of the Finnish situation with respect to the status of international solid and liquid biofuels trade. We found that, for the Finnish case, it is possible to quantify direct and indirect trade volumes of biofuels. The study showed that indirect trade of biofuels has a highly significant role in Finland and may also be a significant sector in global biofuels trade. The purpose of Paper II was to provide a quantified insight into Finnish prospects for meeting the national 2020 renewable energy targets and concurrently becoming a large-scale producer of forest-biomass-based second-generation biofuels for feeding increasing demand in European markets. We found that Finland has good opportunities to realise a scenario to meet 2020 renewable energy targets and for large-scale production of wood-based biofuels. The potential net export of transport biofuels from Finland in 2020 would correspond to 2–3% of European demand. Paper III summarises the global status of international solid and liquid biofuels trade as illuminated by several separate sources. International trade of biofuels was estimated at nearly 1 EJ for 2006. Indirect trade of biofuels through trading of industrial roundwood and material by-products comprises the largest proportion of the trading, with a share of about two thirds. The purpose of Paper IV was to outline a comprehensive picture of the coverage of various certification schemes and sustainability principles relating to the entire value-added chain of biomass and bioenergy. 
Despite the intensive work that has been done in the field of sustainability schemes and principles concerning the use of biomass for energy, weaknesses still exist. The objective of Paper V was to clarify alternative scenarios for the international biomass market until 2020 and to identify the underlying steps needed toward a well-functioning and sustainable market for biomass for energy purposes. An overall conclusion drawn from this analysis concerns the enormous opportunities related to the utilisation of biomass for energy in the coming decades.

Relevance:

100.00%

Publisher:

Abstract:

Software systems are expanding and becoming increasingly present in everyday activities. The constantly evolving society demands that they deliver more functionality, are easy to use and work as expected. All these challenges increase the size and complexity of a system. People may not be aware of the presence of a software system until it malfunctions or even fails to perform. The concept of being able to depend on the software is particularly significant when it comes to critical systems. At this point the quality of a system is regarded as an essential issue, since any deficiency may lead to considerable financial loss or endanger lives. Traditional development methods may not ensure a sufficiently high level of quality. Formal methods, on the other hand, allow us to achieve a high level of rigour and can be applied to develop a complete system or only a critical part of it. Such techniques, applied during system development starting at early design stages, increase the likelihood of obtaining a system that works as required. However, formal methods are sometimes considered difficult to utilise in traditional developments. Therefore, it is important to make them more accessible and reduce the gap between the formal and traditional development methods. This thesis explores the usability of rigorous approaches by giving an insight into formal designs with the use of graphical notation. The understandability of formal modelling is increased due to a compact representation of the development and related design decisions. The central objective of the thesis is to investigate the impact that rigorous approaches have on the quality of developments. This means that it is necessary to establish certain techniques for the evaluation of rigorous developments. Since we are studying various development settings and methods, specific measurement plans and a set of metrics need to be created for each setting. 
Our goal is to provide methods for collecting data and recording evidence of the applicability of rigorous approaches. This would support organisations in making decisions about the integration of formal methods into their development processes. It is important to control software development, especially in its initial stages. Therefore, we focus on the specification and modelling phases, as well as related artefacts, e.g. models. These have a significant influence on the quality of the final system. Since the application of formal methods may increase the complexity of a system, it may impact its maintainability, and thus quality. Our goal is to improve the quality of a system via metrics and measurements, as well as generic refinement patterns, which are applied to a model and a specification. We argue that they can facilitate the process of creating software systems, e.g. by controlling complexity and providing modelling guidelines. Moreover, we regard them as additional mechanisms for quality control and improvement, also for rigorous approaches. The main contribution of this thesis is to provide the metrics and measurements that help in assessing the impact of rigorous approaches on developments. We establish techniques for the evaluation of certain aspects of quality, which are based on structural, syntactical and process-related characteristics of early-stage development artefacts, i.e. specifications and models. The presented approaches are applied to various case studies. The results of the investigation are juxtaposed with the perception of domain experts. It is our aspiration to promote measurements as an indispensable part of the quality control process and a strategy towards quality improvement.
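The kind of structural and syntactical metric collection described above can be illustrated on a toy model. The model representation and the specific metrics below are illustrative assumptions for the sketch, not the thesis's actual measurement plan:

```python
# Sketch: collecting simple structural metrics from an early-stage model
# artefact. The dictionary shape (variables, events with guards/actions)
# loosely mimics a guarded-command-style specification; all names here
# are invented for the example.

def structural_metrics(model):
    """Return a few size/complexity indicators for a toy model."""
    events = model["events"]
    guard_counts = [len(e["guards"]) for e in events]
    return {
        "variables": len(model["variables"]),
        "events": len(events),
        # Average guard count is a crude proxy for branching complexity.
        "avg_guards_per_event": sum(guard_counts) / len(events) if events else 0.0,
    }

toy_model = {
    "variables": ["x", "y"],
    "events": [
        {"name": "inc", "guards": ["x < 10"], "actions": ["x := x + 1"]},
        {"name": "swap", "guards": ["x /= y", "y > 0"], "actions": ["x, y := y, x"]},
    ],
}

print(structural_metrics(toy_model))
```

Tracking such numbers across refinement steps is one way measurement plans of this kind can record how a development's complexity evolves.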

Relevance:

100.00%

Publisher:

Abstract:

We live in an age where rationalization and demands of efficiency taint every aspect of our lives, both as individuals and as a society. Even warfare cannot escape the increased speed of human interaction. Time is a resource to be managed. It has to be optimized, saved and won in military affairs as well. The purpose of this research paper is to analyze the classic texts of military thought in search of answers as to what the classics of strategy saw in the interrelations of temporality and warfare, and whether their thoughts remain meaningful in the contemporary conjuncture. Since the way a society functions is reflected in the way it conducts its wars, there naturally are differences between an agrarian, industrial and information society. Theorists of different eras emphasize things specific to their times, but warfare, like any human interaction, is always bounded by temporality. Not only is the pace of warfare dependent on the progress of the society, but time permeates warfare in all its aspects. This research paper focuses on two specific topics that arose from the texts themselves: how time should be managed and manipulated in warfare, and how to economize and “win” it from the enemy. A method where lengthy quotations are used to illustrate the main points of the strategists has been chosen for this research paper. While Clausewitz is the most prominent source of quotations, thoughts from ancient India and China are represented as well to prove that the combination of the right force in the right place at the right time is still the way of the victorious. Tactics change in the course of time, but the principles of strategy remain unaltered and are only adapted to suit new situations. While ancient and pre-modern societies had their focus on finding auspicious moments for battle in the flow of kronos-time based on divinities, portents and auguries, we can trace elements of the manipulation of time in warfare from the earliest surviving texts. 
While time as a fourth dimension of the battlespace emerged only in the modern era, all through the history of military thought it has had a profound meaning. In the past, time could be squandered; today it always has to be won. This paper asks the question “why”.

Relevance:

100.00%

Publisher:

Abstract:

Formal methods provide a means of reasoning about computer programs in order to prove correctness criteria. One subtype of formal methods is based on the weakest precondition predicate transformer semantics and uses guarded commands as the basic modelling construct. Examples of such formalisms are Action Systems and Event-B. Guarded commands can intuitively be understood as actions that may be triggered when an associated guard condition holds. Guarded commands whose guards hold are nondeterministically chosen for execution, but no further control flow is present by default. Such a modelling approach is convenient for proving correctness, and the Refinement Calculus allows for a stepwise development method. It also has a parallel interpretation facilitating the development of concurrent software, and it is suitable for describing event-driven scenarios. However, for many application areas, the execution paradigm traditionally used comprises more explicit control flow, which constitutes an obstacle to using the above-mentioned formal methods. In this thesis, we study how guarded-command-based modelling approaches can be conveniently and efficiently scheduled in different scenarios. We first focus on the modelling of trust for transactions in a social networking setting. Due to the event-based nature of the scenario, the use of guarded commands turns out to be relatively straightforward. We continue by studying the modelling of concurrent software, with particular focus on compute-intensive scenarios. We go from theoretical considerations to the feasibility of implementation by evaluating the performance and scalability of executing a case study model in parallel using automatic scheduling performed by a dedicated scheduler. Finally, we propose a more explicit and non-centralised approach in which the flow of each task is controlled by a schedule of its own. 
The schedules are expressed in a dedicated scheduling language, and patterns assist the developer in proving correctness of the scheduled model with respect to the original one.
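The guarded-command execution model described above can be sketched in a few lines: repeatedly pick one enabled action nondeterministically until no guard holds. This is a minimal illustration of the idea, not the semantics or tooling of Action Systems or Event-B themselves; all names are invented for the example:

```python
import random

class Action:
    """A guarded command: a guard predicate over the state and an effect."""
    def __init__(self, name, guard, effect):
        self.name = name
        self.guard = guard    # state -> bool
        self.effect = effect  # state -> new state

def run_system(state, actions, max_steps=100, rng=random):
    """Execute until no guard holds (termination) or max_steps is reached.
    The choice among enabled actions is nondeterministic."""
    trace = [dict(state)]
    for _ in range(max_steps):
        enabled = [a for a in actions if a.guard(state)]
        if not enabled:
            break                       # no guard holds: the system stops
        chosen = rng.choice(enabled)    # nondeterministic selection
        state = chosen.effect(state)
        trace.append(dict(state))
    return state, trace

# Example: move value from x to y in steps of 1 or 2; the invariant
# x + y == 3 holds in every reachable state regardless of the choices made.
actions = [
    Action("dec_one", lambda s: s["x"] >= 1,
           lambda s: {"x": s["x"] - 1, "y": s["y"] + 1}),
    Action("dec_two", lambda s: s["x"] >= 2,
           lambda s: {"x": s["x"] - 2, "y": s["y"] + 2}),
]
final, trace = run_system({"x": 3, "y": 0}, actions)
```

The point of the sketch is the scheduling question the thesis addresses: nothing in `run_system` prescribes *which* enabled action runs, which is exactly the freedom a dedicated scheduler or per-task schedule then constrains.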

Relevance:

100.00%

Publisher:

Abstract:

The purpose of this doctoral thesis is to widen and develop our theoretical frameworks for the discussion and analysis of feedback practices in management accounting, particularly shedding light on their formal and informal aspects. The concept of feedback in management accounting has conventionally been analyzed within cybernetic control theory, in which feedback flows as a diagnostic or comparative loop between measurable outputs and pre-set goals (see e.g. Flamholtz et al. 1985; Flamholtz 1996, 1983), i.e. as a formal feedback loop. However, the everyday feedback practices in organizations are combinations of formal and informal elements. In addition to technique-driven feedback approaches (like budgets, measurement, and reward systems) we can also categorize social feedback practices that managers see as relevant and effective in the pursuit of organizational control. While cybernetics or control theories successfully capture rational and measured aspects of organizational performance and offer a broad organizational context for the analysis, many individual and informal aspects remain vague and isolated. In order to discuss and make sense of the heterogeneous field of interpretations of formal and informal feedback, both in theory and practice, dichotomous approaches seem to be insufficient. Therefore, I suggest an analytical framework of formal and informal feedback with three dimensions (3Ds): source, time, and rule. Based on an abductive analysis of the theoretical and empirical findings from an interpretive case study around a business unit called Division Steelco, the 3D framework and formal and informal feedback practices are further elaborated vis-à-vis the four thematic layers in the organizational control model by Flamholtz et al. (1985; Flamholtz 1996, 1983): core control system, organizational structure, organizational culture, and external environment. 
Various personal and cultural meanings given to the formal and informal feedback practices (“feedback as something”) create multidimensional interpretative contexts. Multidimensional frameworks aim to capture and better understand both the variety of interpretations and their implications for the functionality of feedback practices, which is important in interpretive research.

Relevance:

100.00%

Publisher:

Abstract:

Today's networked systems are becoming increasingly complex and diverse. Current simulation and runtime verification techniques do not support developing such systems efficiently; moreover, the reliability of the simulated/verified systems is not thoroughly ensured. To address these challenges, the use of formal techniques to reason about network system development is growing, while at the same time the mathematical background necessary for using formal techniques is a barrier that prevents network designers from employing them efficiently. Thus, these techniques are not widely used for developing networked systems. The objective of this thesis is to propose formal approaches for the development of reliable networked systems while taking efficiency into account. With respect to reliability, we propose the architectural development of correct-by-construction networked system models. With respect to efficiency, we propose reusable network architectures as well as network development. At the core of our development methodology, we employ abstraction and refinement techniques for the development and analysis of networked systems. We evaluate our proposal by applying the proposed architectures to a pervasive class of dynamic networks, i.e., wireless sensor network architectures, as well as to a pervasive class of static networks, i.e., network-on-chip architectures. The ultimate goal of our research is to put forward the idea of building libraries of pre-proved rules for the efficient modelling, development, and analysis of networked systems. We take into account both qualitative and quantitative analysis of networks via varied formal tool support, using a theorem prover (the Rodin platform) and a statistical model checker (SMC-Uppaal).

Relevance:

100.00%

Publisher:

Abstract:

TRIZ is one of the well-known tools for creative problem solving based on analytical methods. This thesis suggests an adapted version of the contradiction matrix, a powerful TRIZ tool, and a few principles based on the concepts of original TRIZ. It is believed that the proposed version would aid in problem solving, especially for problems encountered in chemical process industries with unit operations. In addition, this thesis should help fresh process engineers to recognize the importance of the various available methods for creative problem solving and to learn the TRIZ method of creative problem solving. This thesis work mainly provides an idea of how to modify a TRIZ-based method according to one's requirements, to fit a particular niche area and to solve problems efficiently in a creative way. Here, the contradiction matrix developed is based on a review of common problems encountered in the chemical process industry, particularly in unit operations, and the resolutions are based on approaches used in the past to handle those issues.
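The lookup workflow of a contradiction matrix can be sketched as a simple table from (improving, worsening) parameter pairs to suggested inventive principles. The parameter names and matrix entries below are illustrative placeholders, not entries from the thesis's adapted matrix or from the original 39x39 TRIZ matrix:

```python
# Sketch of a contradiction-matrix lookup in the spirit of the adapted
# matrix described above. All parameters and principle assignments are
# invented placeholders for illustration.

PRINCIPLES = {
    1: "Segmentation",
    10: "Preliminary action",
    21: "Skipping (rushing through)",
    35: "Parameter changes",
}

# (improving parameter, worsening parameter) -> suggested principle numbers
MATRIX = {
    ("filtration rate", "cake moisture"): [35, 10],
    ("product purity", "washing time"): [1, 21],
}

def suggest_principles(improving, worsening):
    """Return (number, name) pairs for a contradiction, or [] if the
    pair is not covered by the matrix."""
    numbers = MATRIX.get((improving, worsening), [])
    return [(n, PRINCIPLES[n]) for n in numbers]

print(suggest_principles("filtration rate", "cake moisture"))
```

The engineering work in building such an adapted matrix lies in populating `MATRIX` from a review of past problems and resolutions in the target domain; the lookup itself stays trivial by design.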

Relevance:

100.00%

Publisher:

Abstract:

This applied linguistic study in the field of second language acquisition investigated the assessment practices of class teachers as well as the challenges and visions of language assessment in bilingual content instruction (CLIL) at primary level in Finnish basic education. Furthermore, pupils’ and their parents’ perceptions of language assessment and LangPerform computer simulations as an alternative, modern assessment method in CLIL contexts were examined. The study was conducted for descriptive and developmental purposes in three phases: 1) a CLIL assessment survey; 2) simulation 1; and 3) simulation 2. All phases had a varying number of participants. The population of this mixed methods study comprised CLIL class teachers, their pupils and the pupils’ parents. The sampling was multi-staged and based on probability and random sampling. The data were triangulated. Altogether 42 CLIL class teachers nationwide, 109 pupils from the 3rd, 4th and 5th grade as well as 99 parents from two research schools in South-Western Finland participated in the CLIL assessment survey, followed by audio-recorded theme interviews with volunteers (10 teachers, 20 pupils and 7 parents). The simulation experiments 1 and 2 produced 146 pupil and 39 parental questionnaires as well as video interviews with volunteering pupils. The data were analysed both quantitatively, using percentages and numerical frequencies, and qualitatively, employing thematic content analysis. Based on the data, language assessment in primary CLIL is not an established practice. It largely appears to be infrequent, incidental, implicit and based on impressions rather than evidence or the curriculum. The most used assessment methods were teacher observation, bilingual tests and dialogic interaction, and the least used were portfolios, simulations and peer assessment. 
Although language assessment was generally perceived as important by teachers, a fifth of them did not gather assessment information systematically, and 38% scarcely gave linguistic feedback to pupils. Both pupils and parents wished to receive more information on CLIL language issues; 91% of pupils claimed to receive feedback rarely or occasionally, and 63% of them wished to get more information on their linguistic coping in CLIL subjects. Of the parents, 76% wished to receive more information on the English proficiency of their children and their linguistic development. This may be a response to the indirect feedback practices identified in this study. There are several challenges related to assessment; the most notable is the lack of a CLIL curriculum, language objectives and common ground principles of assessment. Three diverse approaches to language in CLIL that appear to affect teachers’ views on language assessment were identified: instrumental (language as a tool), dual (language as a tool and object of learning) and eclectic (miscellaneous views, e.g. affective factors prioritised). LangPerform computer simulations seem to be perceived as an appropriate alternative assessment method in CLIL. It is strongly recommended that the fundamentals for assessment (curricula and language objectives) and a mutual assessment scheme should be determined and stakeholders’ knowledge base of CLIL strengthened. The principles of adequate assessment in primary CLIL are identified, and several appropriate assessment methods are suggested.

Relevance:

100.00%

Publisher:

Abstract:

Initially identified as stress-activated protein kinases (SAPKs), the c-Jun N-terminal kinases (JNKs) are currently accepted as potent regulators of various physiologically important cellular events. Named for their ability to phosphorylate the transcription factor c-Jun in response to UV treatment, JNKs play a key role in cell proliferation, cell death and cell migration. Interestingly, these functions are crucial for proper brain formation. The family consists of three JNK isoforms: JNK1, JNK2 and JNK3. Unlike the brain-specific JNK3 isoform, JNK1 and JNK2 are ubiquitously expressed. It is estimated that ten splice variants exist; however, their detailed cellular functions remain undetermined. In addition, physiological conditions keep the activities of JNK2 and JNK3 low in comparison with JNK1, whereas cellular stress raises the activity of these isoforms dramatically. Importantly, JNK1 activity is constitutively high in neurons, yet it does not stimulate cell death. This suggests a valuable role for JNK1 in brain development, but also as an important mediator of cell wellbeing. The aim of this thesis was to characterize the functional relationship between JNK1 and SCG10. We found that SCG10 is a bona fide target of JNK. By employing differential centrifugation, we showed that SCG10 co-localized with active JNK, MKK7 and JIP1 in a fraction containing endosomes and Golgi vesicles. Investigation of JNK knockout tissues using phospho-specific antibodies recognizing the JNK-specific phosphorylation sites on SCG10 (Ser62/Ser73) showed that phosphorylation of endogenous SCG10 was dramatically decreased in Jnk1-/- brains. Moreover, we found that JNK and SCG10 are co-expressed during early embryonic days in brain regions that undergo extensive neuronal migration. Our study revealed that selective inhibition of JNK in the cytoplasm significantly increased both the frequency of exit from the multipolar stage and the radial migration rate. However, as a consequence, it led to ill-defined cellular organization. Furthermore, we found that multipolar exit and radial migration in Jnk1-deficient mice can be connected to changes in the phosphorylation state of SCG10. Also, the expression of a pseudo-phosphorylated mutant form of SCG10, mimicking the JNK1-phosphorylated form, brings the migration rate back to normal in Jnk1 knockout mouse embryos. Furthermore, we investigated the role of SCG10 and JNK in the regulation of Golgi apparatus (GA) biogenesis and whether pathological JNK action could be discernible through its deregulation. We found that SCG10 maintains GA integrity, as in the absence of SCG10 neurons present a more compact, fragmented GA structure, as shown by a knockdown approach. Interestingly, neurons isolated from Jnk1-/- mice show similar characteristics. Blockage of ER-to-GA transport is believed to be involved in the development of Parkinson's disease. Hence, using a pharmacological approach (Brefeldin A treatment), we showed that GA recovery upon removal of the drug is delayed in Jnk1-/- neurons to an extent similar to that in shRNA-SCG10-treated cells. Finally, we investigated the role of the JNK1-SCG10 duo in the maintenance of GA biogenesis following excitotoxic insult. Although the GA underwent fragmentation in response to NMDA treatment, we observed a substantial delay in GA disintegration in neurons lacking either JNK1 or SCG10.

Relevance:

100.00%

Publisher:

Abstract:

Nowadays, computer-based systems tend to become more complex and control increasingly critical functions affecting different areas of human activities. Failures of such systems might result in loss of human lives as well as significant damage to the environment. Therefore, their safety needs to be ensured. However, the development of safety-critical systems is not a trivial exercise. Hence, to preclude design faults and guarantee the desired behaviour, different industrial standards prescribe the use of rigorous techniques for the development and verification of such systems. The more critical the system is, the more rigorous the approach that should be undertaken. To ensure the safety of a critical computer-based system, satisfaction of the safety requirements imposed on this system should be demonstrated. This task involves a number of activities. In particular, a set of safety requirements is usually derived by conducting various safety analysis techniques. Strong assurance that the system satisfies the safety requirements can be provided by formal methods, i.e., mathematically-based techniques. At the same time, the evidence that the system under consideration meets the imposed safety requirements might be demonstrated by constructing safety cases. However, the overall safety assurance process of critical computer-based systems remains insufficiently defined due to the following reasons. Firstly, there are semantic differences between safety requirements and formal models. Informally represented safety requirements should be translated into the underlying formal language to enable further verification. Secondly, the development of formal models of complex systems can be labour-intensive and time-consuming. Thirdly, there are only a few well-defined methods for the integration of formal verification results into safety cases. 
This thesis proposes an integrated approach to the rigorous development and verification of safety-critical systems that (1) facilitates elicitation of safety requirements and their incorporation into formal models, (2) simplifies formal modelling and verification by proposing specification and refinement patterns, and (3) assists in the construction of safety cases from the artefacts generated by formal reasoning. Our chosen formal framework is Event-B. It allows us to tackle the complexity of safety-critical systems as well as to structure safety requirements by applying abstraction and stepwise refinement. The Rodin platform, a tool supporting Event-B, assists in automatic model transformations and proof-based verification of the desired system properties. The proposed approach has been validated by several case studies from different application domains.

Relevance:

100.00%

Publisher:

Abstract:

Human beings have always strived to preserve their memories and spread their ideas. In the beginning this was always done through human interpretations, such as telling stories and creating sculptures. Later, technological progress made it possible to create a recording of a phenomenon: first as an analogue recording onto a physical object, and later digitally, as a sequence of bits to be interpreted by a computer. By the end of the 20th century technological advances had made it feasible to distribute media content over a computer network instead of on physical objects, thus enabling the concept of digital media distribution. Many digital media distribution systems already exist, and their continued, and in many cases increasing, usage is an indicator of the high interest in their future enhancement and enrichment. By looking at these digital media distribution systems, we have identified three main areas of possible improvement: network structure and coordination, transport of content over the network, and the encoding used for the content. In this thesis, our aim is to show that improvements in performance, efficiency and availability can be made in conjunction with improvements in software quality and reliability through the use of formal methods: mathematical approaches to reasoning about software so that we can prove its correctness, together with the desirable properties. We envision a complete media distribution system based on a distributed architecture, such as peer-to-peer networking, in which different parts of the system have been formally modelled and verified. Starting with the network itself, we show how it can be formally constructed and modularised in the Event-B formalism, such that we can separate the modelling of one node from the modelling of the network itself. We also show how the piece selection algorithm in the BitTorrent peer-to-peer transfer protocol can be adapted for on-demand media streaming, and how this can be modelled in Event-B. 
Furthermore, we show how modelling one peer in Event-B can give results similar to simulating an entire network of peers. Going further, we introduce a formal specification language for content transfer algorithms, and show that having such a language can make these algorithms easier to understand. We also show how generating Event-B code from this language can result in less complexity compared to creating the models from written specifications. We also consider the decoding part of a media distribution system by showing how video decoding can be done in parallel. This is based on formally defined dependencies between frames and blocks in a video sequence; we have shown that this step, too, can be performed in a way that is mathematically proven correct. Most of the modelling and proving in this thesis is tool-based. This provides a demonstration of the advance of formal methods as well as their increased reliability, and thus advocates for their more widespread usage in the future.
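The adaptation of BitTorrent piece selection for on-demand streaming discussed above can be sketched as a windowed priority rule: missing pieces near the playback position are requested first, with rarest-first order as the tie-breaker and as the fallback outside the window. This is an illustrative heuristic under assumed inputs, not the formally verified Event-B algorithm from the thesis:

```python
# Sketch: windowed piece selection for on-demand streaming over a
# BitTorrent-style swarm. Function and parameter names are invented
# for the example.

def select_piece(have, availability, playhead, window=4):
    """Pick the next piece index to request, or None if all pieces are held.

    have         -- set of piece indices already downloaded
    availability -- list: availability[i] = number of peers holding piece i
    playhead     -- index of the piece currently being played
    window       -- number of pieces ahead of the playhead that are urgent
    """
    n = len(availability)
    missing = [i for i in range(n) if i not in have]
    if not missing:
        return None
    # High priority: missing pieces inside the playback window, since
    # stalling playback is worse than suboptimal swarm behaviour.
    urgent = [i for i in missing if playhead <= i < playhead + window]
    candidates = urgent if urgent else missing
    # Rarest-first among the candidates keeps swarm health; the index
    # breaks ties deterministically.
    return min(candidates, key=lambda i: (availability[i], i))
```

Plain rarest-first optimises swarm-wide piece diversity but ignores playback order; the windowed rule trades a little of that diversity for timely delivery, which is the core tension any streaming adaptation of BitTorrent has to resolve.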

Relevance:

100.00%

Publisher:

Abstract:

We have investigated Russian children’s reading acquisition during an intermediate period in their development: after literacy onset, but before they have acquired well-developed decoding skills. The results of our study suggest that Russian first graders rely primarily on phonemes and syllables as reading grain-size units. Phonemic awareness seems to have reached the metalinguistic level more rapidly than syllabic awareness after the onset of reading instruction, a reversal that is typical of the initial stages of formal reading instruction, which creates an external demand for phonemic awareness. Another reason might be the inherent instability of syllabic boundaries in Russian. We have shown that body-coda is a more natural representation of subsyllabic structure in Russian than onset-rime. We also found that Russian children displayed variability in syllable onset and offset decisions, which can be attributed to the lack of congruence between syllabic and morphemic word division in Russian. We suggest that the fuzziness of syllable boundary decisions is a sign of the transitional nature of this stage in reading development, and that it indicates progress towards an awareness of morphologically determined closed syllables. Our study also showed that orthographic complexity exerts an influence on reading in Russian from the very start of reading acquisition. Besides, we found that Russian first graders experience fluency difficulties in reading orthographically simple words and nonwords of two and more syllables. The transition from monosyllabic to bisyllabic lexical items constitutes a certain threshold, for which the syllabic structure seemed to make no difference. 
When we compared the outcomes of the Russian children with those produced by speakers of other languages, we discovered that in tasks which could be performed with the help of alphabetic recoding, Russian children’s accuracy was comparable to that of children learning to read in relatively shallow orthographies. In tasks where this approach works only partially, Russian children demonstrated accuracy results similar to those in deeper orthographies. This pattern of moderate results in accuracy and excellent performance in terms of reaction times is an indication that children apply phonological recoding as their dominant strategy in various reading tasks and are only beginning to develop suitable multiple strategies for dealing with orthographically complex material. The development of these strategies is not completed during Grade 1, and the shift towards diversification of strategies apparently continues in Grade 2.