918 results for Quasi-Normalité
Abstract:
Although internet chat is a significant aspect of many internet users’ lives, the manner in which participants in quasi-synchronous chat situations orient to issues of social and moral order remains to be studied in depth. The research presented here is therefore at the forefront of a continually developing area of study. This work contributes new insights into how members construct and make accountable the social and moral orders of an adult-oriented Internet Relay Chat (IRC) channel by addressing three questions: (1) What conversational resources do participants use in addressing matters of social and moral order? (2) How are these conversational resources deployed within IRC interaction? and (3) What interactional work is locally accomplished through use of these resources? A survey of the literature reveals considerable research in the field of computer-mediated communication, exploring both asynchronous and quasi-synchronous discussion forums. The research discussed represents a range of communication interests including group and collaborative interaction, the linguistic construction of social identity, and the linguistic features of online interaction. It is suggested that the present research differs from previous studies in three ways: (1) it focuses on the interaction itself, rather than the ways in which the medium affects the interaction; (2) it offers turn-by-turn analysis of interaction in situ; and (3) it discusses membership categories only insofar as they are shown to be relevant by participants through their talk. Through consideration of the literature, the present study is firmly situated within the broader computer-mediated communication field. 
Ethnomethodology, conversation analysis and membership categorization analysis were adopted as appropriate methodological approaches to explore the research focus on interaction in situ, and in particular to investigate the ways in which participants negotiate and co-construct social and moral orders in the course of their interaction. IRC logs collected from one chat room were analysed using a two-pass method, based on a modification of the approaches proposed by Pomerantz and Fehr (1997) and ten Have (1999). From this detailed examination of the data corpus three interaction topics are identified by means of which participants clearly orient to issues of social and moral order: challenges to rule violations, ‘trolling’ for cybersex, and experiences regarding the 9/11 attacks. Instances of these interactional topics are subjected to fine-grained analysis, to demonstrate the ways in which participants draw upon various interactional resources in their negotiation and construction of channel social and moral orders. While these analytical topics stand alone in individual focus, together they illustrate different instances in which participants’ talk serves to negotiate social and moral orders or collaboratively construct new orders. Building on the work of Vallis (2001), Chapter 5 illustrates three ways that rule violation is initiated as a channel discussion topic: (1) through a visible violation in open channel, (2) through an official warning or sanction by a channel operator regarding the violation, and (3) through a complaint or announcement of a rule violation by a non-channel operator participant. Once the topic has been initiated, it is shown to become available as a topic for others, including the perceived violator. The fine-grained analysis of challenges to rule violations ultimately demonstrates that channel participants orient to the rules as a resource in developing categorizations of both the rule violation and violator. 
These categorizations are contextual in that they are locally based and understood within specific contexts and practices. Thus, it is shown that compliance with rules and an orientation to rule violations as inappropriate within the social and moral orders of the channel serves two purposes: (1) to orient the speaker as a group member, and (2) to reinforce the social and moral orders of the group. Chapter 6 explores a particular type of rule violation, solicitations for ‘cybersex’ known in IRC parlance as ‘trolling’. In responding to trolling violations participants are demonstrated to use affiliative and aggressive humour, in particular irony, sarcasm and insults. These conversational resources perform solidarity building within the group, positioning non-Troll respondents as compliant group members. This solidarity work is shown to have three outcomes: (1) consensus building, (2) collaborative construction of group membership, and (3) the continued construction and negotiation of existing social and moral orders. Chapter 7, the final data analysis chapter, offers insight into how participants, in discussing the events of 9/11 on the actual day, collaboratively constructed new social and moral orders, while orienting to issues of appropriate and reasonable emotional responses. This analysis demonstrates how participants go about ‘doing being ordinary’ (Sacks, 1992b) in formulating their ‘first thoughts’ (Jefferson, 2004). Through sharing their initial impressions of the event, participants perform support work within the interaction, in essence working to normalize both the event and their initial misinterpretation of it. Normalising as a support work mechanism is also shown in relation to participants constructing the ‘quiet’ following the event as unusual. 
Normalising is accomplished by reference to the indexical ‘it’ and location formulations, which participants use both to negotiate who can claim to experience the ‘unnatural quiet’ and to identify the extent of the quiet. Through their talk participants upgrade the quiet from something legitimately experienced by one person in a particular place to something that could be experienced ‘anywhere’, moving the phenomenon from local to global provenance. With its methodological design and detailed analysis and findings, this research contributes to existing knowledge in four ways. First, it shows how rules are used by participants as a resource in negotiating and constructing social and moral orders. Second, it demonstrates that irony, sarcasm and insults are three devices of humour which can be used to perform solidarity work and reinforce existing social and moral orders. Third, it demonstrates how new social and moral orders are collaboratively constructed in relation to extraordinary events, which serve to frame the event and evoke reasonable responses for participants. And last, the detailed analysis and findings further support the use of conversation analysis and membership categorization as valuable methods for approaching quasi-synchronous computer-mediated communication.
Abstract:
Perspectives on work-life balance (WLB) reflected in political, media and organisational discourse would maintain that WLB is on the agenda because of broad social, economic and political factors (Fleetwood 2007). In contrast, critical scholarship which examines WLB and its associated practices maintains that workplace flexibility is more than a quasi-functionalist response to contemporary problems faced by individuals, families or organisations. For example, the literature identifies where flexible work arrangements have not lived up to expectations as a panacea for work-home conflicts, being characterised as much by employer-driven working conditions that disadvantage workers and constrain balance as by employee-friendly practices that enable it (Charlesworth 1997). Further, even where generous organisational work-life balance policies exist, under-utilisation is an issue (Schaefer et al. 2007). Compounding these issues is the fact that many employees perceive their paid work as becoming more intense, pressured and demanding (Townsend et al. 2003).
Abstract:
Damage localization induced by strain softening can be predicted by direct minimization of a global energy function. This article concerns the computational strategy for implementing this principle for softening materials such as concrete. Instead of using heuristic global optimization techniques, our strategy combines local optimization methods with a path-finding approach to ensure a global optimum. With admissible nodal displacements as the independent variables, it is easy to handle the geometric (mesh) constraint conditions. The direct search optimization methods recover the localized solutions for a range of softening lattice models which are representative of quasi-brittle structures.
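The minimization principle in this abstract can be illustrated with a toy softening system. The sketch below is illustrative only; the spring energy, step size and starting points are assumptions, not taken from the article. Two softening springs in series share a prescribed elongation: plain local descent started from the symmetric state stalls on an unstable stationary point, while restarting from other points (a crude stand-in for the path-finding layer) recovers the lower-energy localized solution.

```python
def spring_energy(u):
    """Softening spring: stored energy rises then saturates, so the
    force e'(u) peaks and then decays (quasi-brittle behaviour)."""
    return u * u / (1.0 + u * u)

def total_energy(u1, U=4.0):
    """Two softening springs in series sharing a prescribed elongation U;
    the second spring carries U - u1."""
    return spring_energy(u1) + spring_energy(U - u1)

def local_minimize(u1, U=4.0, step=0.01, iters=20000, h=1e-6):
    """Plain projected gradient descent, a stand-in for the local solver."""
    for _ in range(iters):
        grad = (total_energy(u1 + h, U) - total_energy(u1 - h, U)) / (2 * h)
        u1 = min(max(u1 - step * grad, 0.0), U)
    return u1

# Multistart stands in for the path-finding layer: descent from the
# symmetric state u1 = U/2 stays on an unstable stationary point, while
# the other starts reach the lower-energy localized solutions.
starts = [0.5, 2.0, 3.5]
solutions = [local_minimize(s) for s in starts]
best = min(solutions, key=total_energy)
# the global minimum concentrates almost all deformation in one spring
```

Note the design point this toy makes: the symmetric (homogeneous) state is a stationary point of the energy, so a purely local method can sit on it forever; some global mechanism is needed to reach the localized minimum.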
Abstract:
Aim. This paper is a report of the effectiveness of a purpose-designed education programme in improving undergraduate nursing students’ understanding and practice of infection control precautions. Background. The severe acute respiratory syndrome outbreak in 2003 highlighted that healthcare workers were under-prepared for such an epidemic. While many in-service education sessions were arranged by institutions in response to the outbreak, preservice nursing education has overlooked preparation for handling such infectious disease epidemics. Method. A quasi-experimental design was used and a 16-hour, purpose-designed infection control education programme was implemented for preservice nursing students in southern Taiwan. Self-administered questionnaires were distributed at three time points during the period September 2005 to April 2006 to examine the sustainability and effectiveness of the intervention. Results. A total of 175 preservice nursing students participated in the study. Following the education programme, students in the intervention group showed a statistically significant improvement across time in their knowledge of these precautions [F(2, 180) = 13.53, P < 0.001] and confidence in resolving infection-related issues [F(1.79, 168.95) = 3.24] when compared with those in the control group. Conclusion. To improve nursing students’ capacity in responding to infectious epidemics, an educational programme that integrates the theme of infection precautions, learning theory and teaching strategies is recommended for all nursing institutes.
Abstract:
Background: Blood for transfusion may become contaminated at any point between collection and transfusion and may result in bacteraemia (the presence of bacteria in the blood), severe illness or even death for the blood recipient. Donor arm skin is one potential source of blood contamination, so it is usual to cleanse the skin with an antiseptic before blood donation. One-step and two-step alcohol-based antiseptic regimens are both commonly advocated but there is uncertainty as to which is most effective. Objectives: To assess the effects of cleansing the skin of blood donors with alcohol in a one-step compared with alcohol in a two-step procedure to prevent contamination of collected blood or bacteraemia in the recipient. Search strategy: We searched the Cochrane Wounds Group Specialised Register (March 10 2009); The Cochrane Central Register of Controlled Trials (CENTRAL), The Cochrane Library 2009, Issue 1; Ovid MEDLINE (1950 to February Week 4 2009); Ovid EMBASE (1980 to 2009 Week 9); and EBSCO CINAHL (1982 to February Week 4 2009). We also searched the reference lists of key papers. Selection criteria: All randomised controlled trials (RCTs) comparing alcohol-based donor skin cleansing in a one-step versus a two-step process that includes alcohol and any other antiseptic for pre-venepuncture skin cleansing were considered. Quasi-randomised trials were to have been considered in the absence of RCTs. Data collection and analysis: Two review authors independently assessed studies for inclusion. Main results: No studies (RCTs or quasi-RCTs) met the inclusion criteria. Authors’ conclusions: We did not identify any eligible studies for inclusion in this review.
It is therefore unclear whether a two-step, alcohol followed by antiseptic skin cleansing process prior to blood donation confers any reduction in the risk of blood contamination or bacteraemia in blood recipients, or conversely whether a one-step process increases risk above that associated with a two-step process.
Abstract:
1. Ecological data sets often use clustered measurements or use repeated sampling in a longitudinal design. Choosing the correct covariance structure is an important step in the analysis of such data, as the covariance describes the degree of similarity among the repeated observations. 2. Three methods for choosing the covariance are: the Akaike information criterion (AIC), the quasi-information criterion (QIC), and the deviance information criterion (DIC). We compared the methods using a simulation study and using a data set that explored effects of forest fragmentation on avian species richness over 15 years. 3. The overall success was 80.6% for the AIC, 29.4% for the QIC and 81.6% for the DIC. For the forest fragmentation study the AIC and DIC selected the unstructured covariance, whereas the QIC selected the simpler autoregressive covariance. Graphical diagnostics suggested that the unstructured covariance was probably correct. 4. We recommend using DIC for selecting the correct covariance structure.
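Since the QIC and DIC follow the same penalised-fit logic as the AIC, the selection step itself is simple to sketch. The snippet below is a minimal illustration, not taken from the study: the log-likelihoods and parameter counts for the three candidate covariance structures are invented. It computes AIC = -2 ln L + 2k for each candidate and picks the minimum; note that a richer structure (e.g. unstructured) buys a better fit at the cost of many more covariance parameters.

```python
def aic(log_likelihood, n_params):
    """Akaike information criterion: AIC = -2*lnL + 2*k."""
    return -2.0 * log_likelihood + 2.0 * n_params

# Hypothetical fits of one longitudinal model under three candidate
# covariance structures (values are illustrative only).
candidates = {
    "independence":   {"lnL": -512.3, "k": 4},
    "autoregressive": {"lnL": -498.7, "k": 5},
    "unstructured":   {"lnL": -488.1, "k": 14},
}

scores = {name: aic(c["lnL"], c["k"]) for name, c in candidates.items()}
best = min(scores, key=scores.get)   # smallest AIC wins
```

In this invented example the unstructured covariance wins despite its penalty, mirroring the forest-fragmentation result where the AIC and DIC both selected the unstructured covariance.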
Abstract:
Background: In many clinical areas, integrated care pathways are utilised as structured multidisciplinary care plans which detail essential steps in caring for patients with specific clinical problems. In particular, care pathways for the dying have been developed as a model to improve the end-of-life care of all patients. They aim to ensure that the most appropriate management occurs at the most appropriate time and that it is provided by the most appropriate health professional. Clinical pathways for end-of-life care management are used widely around the world and have been regarded as the gold standard. There is therefore a significant need for a systematic review to inform clinicians about the utilisation of end-of-life care pathways. Objectives: To assess the effects of end-of-life care pathways, compared with usual care (no pathway) or with care guided by another end-of-life care pathway, across all healthcare settings (e.g. hospitals, residential aged care facilities, community). Search strategy: The Cochrane Central Register of Controlled Trials (CENTRAL), the Pain, Palliative and Supportive Care Review Group specialised register, MEDLINE, EMBASE, review articles and reference lists of relevant articles were searched. The search was carried out in September 2009. Selection criteria: All randomised controlled trials (RCTs), quasi-randomised trials or high-quality controlled before-and-after studies comparing use versus non-use of an end-of-life care pathway in caring for the dying. Data collection and analysis: Results of searches were reviewed against the pre-determined criteria for inclusion by two review authors. Main results: The search identified 920 potentially relevant titles, but no studies met the criteria for inclusion in the review. Authors’ conclusions: Without further available evidence, recommendations for the use of end-of-life care pathways in caring for the dying cannot be made.
RCTs or other well designed controlled studies are needed for evaluating the use of end-of-life care pathways in caring for dying people.
Abstract:
This thesis argues that the end of Soviet Marxism and a bipolar global political imaginary at the dissolution of the short Twentieth Century poses an obstacle for anti-systemic political action. Such a blockage of alternate political imaginaries can be discerned by reading the work of Francis Fukuyama and "Endism" as performative invocations of the closure of political alternatives, and thus as an ideological proclamation which enables and constrains forms of social action. It is contended that the search through dialectical thought for a competing universal to posit against "liberal democracy" is a fruitless one, because it reinscribes the terms of teleological theories of history which work to effect closure. Rather, constructing a phenomenological analytic of the political conjuncture, the thesis suggests that the figure of messianism without a Messiah is central to a deconstructive reframing of the possibilities of political action - a reframing attentive to the rhetorical tone of texts. The project of recovering the political is viewed through a phenomenological lens. An agonistic political distinction must be made so as to memorialise the remainders and ghosts of progress, and thus to gesture towards an indeconstructible justice which would serve as a horizon for the articulation of an empty universal. This project is furthered by a return to a certain phenomenology inspired by Cornelius Castoriadis, Claude Lefort, Maurice Merleau-Ponty and Ernesto Laclau. The thesis provides a reading of Jacques Derrida and Walter Benjamin as thinkers of a minor universalism, a non-prescriptive utopia, and places their work in the context of new understandings of religion and the political as quasi-transcendentals which can be utilised to think through the aporias of political time in order to grasp shards of meaning. 
Derrida and Chantal Mouffe's deconstructive critique and supplement to Carl Schmitt's concept of the political is read as suggestive of a reframing of political thought which would leave the political question open and thus enable the articulation of social imaginary significations able to inscribe meaning in the field of political action. Thus, the thesis gestures towards a form of thought which enables rather than constrains action under the sign of justice.
Abstract:
Financial processes may possess long memory and their probability densities may display heavy tails. Many models have been developed to deal with this tail behaviour, which reflects the jumps in the sample paths. On the other hand, the presence of long memory, which contradicts the efficient market hypothesis, is still an issue for further debate. These difficulties present challenges for memory detection and for modelling the co-presence of long memory and heavy tails. This PhD project aims to respond to these challenges. The first part aims to detect memory in a large number of financial time series on stock prices and exchange rates using their scaling properties. Since financial time series often exhibit stochastic trends, a common form of nonstationarity, strong trends in the data can lead to false detection of memory. We will take advantage of a technique known as multifractal detrended fluctuation analysis (MF-DFA) that can systematically eliminate trends of different orders. This method is based on the identification of scaling of the q-th-order moments and is a generalisation of the standard detrended fluctuation analysis (DFA) which uses only the second moment; that is, q = 2. We also consider the rescaled range R/S analysis and the periodogram method to detect memory in financial time series and compare their results with the MF-DFA. An interesting finding is that short memory is detected for stock prices of the American Stock Exchange (AMEX) and long memory is found to be present in the time series of two exchange rates, namely the French franc and the Deutsche mark. Electricity price series of the five states of Australia are also found to possess long memory. For these electricity price series, heavy tails are also pronounced in their probability densities. The second part of the thesis develops models to represent short-memory and long-memory financial processes as detected in Part I.
These models take the form of continuous-time AR(∞)-type equations whose kernel is the Laplace transform of a finite Borel measure. By imposing appropriate conditions on this measure, short memory or long memory in the dynamics of the solution will result. A specific form of the models, which has a good MA(∞)-type representation, is presented for the short-memory case. Parameter estimation of this type of model is performed via least squares, and the models are applied to the stock prices in the AMEX, which have been established in Part I to possess short memory. By selecting the kernel in the continuous-time AR(∞)-type equations to have the form of a Riemann-Liouville fractional derivative, we obtain a fractional stochastic differential equation driven by Brownian motion. This type of equation is used to represent financial processes with long memory, whose dynamics is described by the fractional derivative in the equation. These models are estimated via quasi-likelihood, namely via a continuous-time version of the Gauss-Whittle method. The models are applied to the exchange rates and the electricity prices of Part I with the aim of confirming their possible long-range dependence established by MF-DFA. The third part of the thesis provides an application of the results established in Parts I and II to characterise and classify financial markets. We will pay attention to the New York Stock Exchange (NYSE), the American Stock Exchange (AMEX), the NASDAQ Stock Exchange (NASDAQ) and the Toronto Stock Exchange (TSX). The parameters from MF-DFA and those of the short-memory AR(∞)-type models will be employed in this classification. We propose the Fisher discriminant algorithm to find a classifier in the two- and three-dimensional spaces of data sets and then provide cross-validation to verify discriminant accuracies. This classification is useful for understanding and predicting the behaviour of different processes within the same market.
The fourth part of the thesis investigates the heavy-tailed behaviour of financial processes which may also possess long memory. We consider fractional stochastic differential equations driven by stable noise to model financial processes such as electricity prices. The long memory of electricity prices is represented by a fractional derivative, while the stable noise input models their non-Gaussianity via the tails of their probability density. A method using the empirical densities and MF-DFA will be provided to estimate all the parameters of the model and simulate sample paths of the equation. The method is then applied to analyse daily spot prices for five states of Australia. Comparisons with the results obtained from the R/S analysis, the periodogram method and MF-DFA are provided. The results from fractional SDEs agree with those from MF-DFA, which are based on multifractal scaling, while those from the periodograms, which are based on the second-order moment, seem to underestimate the long-memory dynamics of the process. This highlights the need for and usefulness of fractal methods in modelling non-Gaussian financial processes with long memory.
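The standard DFA that MF-DFA generalises can be sketched compactly. The code below is a minimal illustration, not the thesis's implementation: it integrates the series, detrends it over windows of increasing size, and reads a memory exponent off the log-log slope of the fluctuation function. MF-DFA repeats this for q-th-order moments rather than only the RMS (q = 2) case shown here, and the thesis additionally compares R/S and periodogram estimates.

```python
import numpy as np

def dfa(x, scales, order=1):
    """Detrended fluctuation analysis: returns the scaling exponent
    alpha, the slope of log F(s) vs log s, where F(s) is the RMS
    fluctuation of the integrated, locally detrended profile."""
    x = np.asarray(x, dtype=float)
    profile = np.cumsum(x - x.mean())          # integrated profile
    flucts = []
    for s in scales:
        n_win = len(profile) // s
        rms = []
        for i in range(n_win):
            seg = profile[i * s:(i + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, order)   # local polynomial trend
            detrended = seg - np.polyval(coef, t)
            rms.append(np.sqrt(np.mean(detrended ** 2)))
        flucts.append(np.mean(rms))
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

rng = np.random.default_rng(0)
noise = rng.standard_normal(4096)              # uncorrelated test series
alpha = dfa(noise, scales=[16, 32, 64, 128, 256])
# uncorrelated noise gives alpha near 0.5; long memory gives alpha > 0.5
```

The exponent is what the memory-detection step in Part I reads off: values significantly above 0.5 over a range of scales indicate long-range dependence, values near 0.5 short memory.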
Abstract:
Communication is one team process factor that has received considerable research attention in the team literature. This literature provides equivocal evidence regarding the role of communication in team performance and yet, does not provide any evidence for when communication becomes important for team performance. This research program sought to address this evidence gap by a) testing task complexity and team member diversity (race diversity, gender diversity and work value diversity) as moderators of the team communication — performance relationship; and b) testing a team communication — performance model using established teams across two different task types. The functional perspective was used as the theoretical framework for operationalizing team communication activity. The research program utilised a quasi-experimental research design with participants from a large multi-national information technology company whose Head Office was based in Sydney, Australia. Participants voluntarily completed two team building exercises (a decision making and production task), and completed two online questionnaires. In total, data were collected from 1039 individuals who constituted 203 work teams. Analysis of the data revealed a small number of significant moderation effects, not all in the expected direction. However, an interesting and unexpected finding also emerged from Study One. Large and significant correlations between communication activity ratings were found across tasks, but not within tasks. This finding suggested that teams were displaying very similar profiles of communication on each task, despite the tasks having different communication requirements. 
Given this finding, Study Two sought to a) determine the relative importance of task versus team effects in explaining variance in team communication measures for established teams; b) determine if established teams had reliable and discernable team communication profiles and if so, c) investigate whether team communication profiles related to task performance. Multi-level modeling and repeated measures analysis of variance (ANOVA) revealed that task type did not have an effect on team communication ratings. However, teams accounted for 24% of the total variance in communication measures. Through cluster analysis, five reliable and distinct team communication profiles were identified. Consistent with the findings of the multi-level analysis and repeated measures ANOVA, teams’ profiles were virtually identical across the decision making and production tasks. A relationship between communication profile and performance was identified for the production task, although not for the decision making task. This research responds to calls in the literature for a better understanding of when communication becomes important for team performance. The moderators tested in this research were not found to have a substantive or reliable effect on the relationship between communication and performance. However, the consistency in team communication activity suggests that established teams can be characterized by their communication profiles and further, that these communication profiles may have implications for team performance. The findings of this research provide theoretical support for the functional perspective in terms of the communication – performance relationship and further support the team development literature as an explanation for the stability in team communication profiles. This research can also assist organizations to better understand the specific types of communication activity and profiles of communication that could offer teams a performance advantage.
Abstract:
Low back pain is an increasing problem in industrialised countries and although it is a major socio-economic problem in terms of medical costs and lost productivity, relatively little is known about the processes underlying the development of the condition. This is in part due to the complex interactions between bone, muscle, nerves and other soft tissues of the spine, and the fact that direct observation and/or measurement of the human spine is not possible using non-invasive techniques. Biomechanical models have been used extensively to estimate the forces and moments experienced by the spine. These models provide a means of estimating the internal parameters which can not be measured directly. However, application of most of the models currently available is restricted to tasks resembling those for which the model was designed due to the simplified representation of the anatomy. The aim of this research was to develop a biomechanical model to investigate the changes in forces and moments which are induced by muscle injury. In order to accurately simulate muscle injuries a detailed quasi-static three dimensional model representing the anatomy of the lumbar spine was developed. This model includes the nine major force generating muscles of the region (erector spinae, comprising the longissimus thoracis and iliocostalis lumborum; multifidus; quadratus lumborum; latissimus dorsi; transverse abdominis; internal oblique and external oblique), as well as the thoracolumbar fascia through which the transverse abdominis and parts of the internal oblique and latissimus dorsi muscles attach to the spine. The muscles included in the model have been represented using 170 muscle fascicles each having their own force generating characteristics and lines of action. Particular attention has been paid to ensuring the muscle lines of action are anatomically realistic, particularly for muscles which have broad attachments (e.g. 
internal and external obliques), muscles which attach to the spine via the thoracolumbar fascia (e.g. transverse abdominis), and muscles whose paths are altered by bony constraints such as the rib cage (e.g. iliocostalis lumborum pars thoracis and parts of the longissimus thoracis pars thoracis). In this endeavour, a separate sub-model which accounts for the shape of the torso by modelling it as a series of ellipses has been developed to model the lines of action of the oblique muscles. Likewise, a separate sub-model of the thoracolumbar fascia has also been developed which accounts for the middle and posterior layers of the fascia, and ensures that the line of action of the posterior layer is related to the size and shape of the erector spinae muscle. Published muscle activation data are used to enable the model to predict the maximum forces and moments that may be generated by the muscles. These predictions are validated against published experimental studies reporting maximum isometric moments for a variety of exertions. The model performs well for flexion, extension and lateral bend exertions, but underpredicts the axial twist moments that may be developed. This discrepancy is most likely the result of differences between the experimental methodology and the modelled task. The application of the model is illustrated using examples of muscle injuries created by surgical procedures. The three examples used represent a posterior surgical approach to the spine, an anterior approach to the spine and unilateral total hip replacement surgery. Although the three examples simulate different muscle injuries, all demonstrate the production of significant asymmetrical moments and/or reduced joint compression following surgical intervention. This result has implications for patient rehabilitation and the potential for further injury to the spine. The development and application of the model has highlighted a number of areas where current knowledge is deficient.
These include muscle activation levels for tasks in postures other than upright standing, changes in spinal kinematics following surgical procedures such as spinal fusion or fixation, and a general lack of understanding of how the body adjusts to muscle injuries with respect to muscle activation patterns and levels, rate of recovery from temporary injuries and compensatory actions by other muscles. Thus the comprehensive and innovative anatomical model which has been developed not only provides a tool to predict the forces and moments experienced by the intervertebral joints of the spine, but also highlights areas where further clinical research is required.
Abstract:
Industrial applications of the simulated-moving-bed (SMB) chromatographic technology have brought an emergent demand to improve the SMB process operation for higher efficiency and better robustness. Improved process modelling and more-efficient model computation will pave a path to meet this demand. However, the SMB unit operation exhibits complex dynamics, leading to challenges in SMB process modelling and model computation. One of the significant problems is how to quickly obtain the steady state of an SMB process model, as process metrics at the steady state are critical for process design and real-time control. The conventional computation method, which solves the process model cycle by cycle and takes the solution only when a cyclic steady state is reached after a certain number of switchings, is computationally expensive. Adopting the concept of the quasi-envelope (QE), this work treats the SMB operation as a pseudo-oscillatory process because of its large number of continuous switchings. Then, an innovative QE computation scheme is developed to quickly obtain the steady-state solution of an SMB model for any arbitrary initial condition. The QE computation scheme allows larger steps to be taken for predicting the slow change of the starting state within each switching. In combination with the wavelet-based technique, this scheme is demonstrated to be effective and efficient for an SMB sugar separation process. Moreover, investigations are also carried out on when the computation scheme should be activated and how the convergence of the scheme is affected by a variable stepsize.
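The speed-up idea behind a quasi-envelope scheme can be illustrated with a toy switching map. In the sketch below (illustrative only; the relaxation rate, target state and envelope step H are assumptions, not taken from the paper), one function application stands for one switching period. The conventional approach iterates switching after switching until the cyclic steady state is reached, while the QE-style step treats the cycle-to-cycle change P(x) - x as the slope of a slow envelope and advances many switchings at once.

```python
def switch_map(x, target=1.0, rate=0.02):
    """One SMB switching period: the state relaxes slowly toward the
    cyclic steady state (toy scalar contraction; rate is illustrative)."""
    return x + rate * (target - x)

def cycle_by_cycle(x0, tol=1e-6, target=1.0):
    """Conventional approach: integrate switching after switching."""
    x, n = x0, 0
    while abs(x - target) > tol:
        x = switch_map(x)
        n += 1
    return x, n

def quasi_envelope(x0, H=20, tol=1e-6, target=1.0):
    """QE-style acceleration: treat P(x) - x as the slope of the slow
    envelope and take a step worth H switching periods at once."""
    x, n = x0, 0
    while abs(x - target) > tol:
        x = x + H * (switch_map(x) - x)   # explicit Euler on the envelope
        n += 1
    return x, n

x_plain, n_plain = cycle_by_cycle(0.0)   # hundreds of switchings
x_qe, n_qe = quasi_envelope(0.0)         # far fewer envelope steps
```

Both routes reach the same steady state; the envelope route needs an order of magnitude fewer function evaluations, which is the point of taking larger steps over the slowly varying starting state.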
Abstract:
In situ near-IR transmittance measurements have been used to characterize the density of trapped electrons in dye-sensitized solar cells (DSCs). Measurements have been made under a range of experimental conditions, including during open-circuit photovoltage decay and during recording of the current-voltage (I-V) characteristic. The optical cross section of electrons at 940 nm was determined by relating the IR absorbance to the density of trapped electrons measured by charge extraction. The value, σn = 5.4 × 10⁻¹⁸ cm², was used to compare the trapped electron densities in illuminated DSCs under open- and short-circuit conditions in order to quantify the difference in the quasi-Fermi level, nEF. It was found that the difference in nEF for the cells studied was 250 meV over a wide range of illumination intensities. IR transmittance measurements have also been used to quantify shifts in conduction band energy associated with dye adsorption.
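The two quantitative steps implied by this abstract can be sketched numerically: converting an IR absorbance change into a trapped-electron density via Beer-Lambert using a cross section like the one reported, and converting a ratio of electron densities into a quasi-Fermi-level difference. The film thickness, the Boltzmann-type density relation, and the factor `m` below are illustrative assumptions, not values or relations taken from the thesis:

```python
import math

KT_EV = 0.0257  # thermal energy k_B * T at ~298 K, in eV

def trapped_density(delta_absorbance, sigma_cm2, thickness_cm):
    # Beer-Lambert: a base-10 absorbance change dA relates to the
    # trapped-electron density n by dA = sigma * n * d / ln(10),
    # so n = ln(10) * dA / (sigma * d), in cm^-3.
    return math.log(10) * delta_absorbance / (sigma_cm2 * thickness_cm)

def quasi_fermi_difference(n_oc, n_sc, m=1.0):
    # Quasi-Fermi-level difference (eV) inferred from two electron
    # densities, assuming n ~ exp(E_F / (m * k_B * T)); m = 1 is
    # the ideal Boltzmann case, m > 1 crudely mimics an exponential
    # trap tail.
    return m * KT_EV * math.log(n_oc / n_sc)
```

For example, with the reported cross section 5.4 × 10⁻¹⁸ cm², an absorbance change of 0.01 across a hypothetical 10 μm film corresponds to a density of a few times 10¹⁸ cm⁻³; a tenfold density ratio at m = 1 corresponds to a quasi-Fermi-level split of about 59 meV.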
Abstract:
This study sought to establish and develop innovative instructional procedures, in which scaffolding can be expanded and applied, in order to enhance the learning of English as a Foreign Language (EFL) writing skills in an effective hybrid learning community (a combination of face-to-face and online modes of learning) at the university where the researcher is working. Many educational experts still believe that technology has not been harnessed to its full potential to meet new online characteristics and trends. There is also an urgent need to reconsider the pedagogical perspectives involved in the utilisation of online learning systems in general, and the social interactions within online courses in particular, which have been neglected to date. An action research design, conducted in two cycles over a period of four months, was utilised throughout this study. It was intended not only to achieve a paradigm shift from transmission-absorption to socio-constructivist teaching/learning methodologies but also to inform practice in these technology-rich environments. Five major findings emerged from the study. First, scaffolding theory has been extended: two new scaffolding types (i.e., quasi-transcendental scaffolding and transcendental scaffolding), two scaffolding aspects (i.e., receptive and productive) and several scaffolding actions (e.g., providing a stimulus, awareness, reminder, or remedy) for EFL writing skills in an effective hybrid learning community have been identified and elaborated on. Second, the EFL ‘Effective Writing’ students used the scaffolds implemented in a hybrid environment to enhance and enrich their learning of English essay writing. The online activities, conducted after the F2F sessions most of the time, gave students greater opportunities to both reinforce and expand the knowledge they had acquired in the F2F mode.
Third, the variety of teaching techniques, online tasks and discussion topics utilised in the two modes bolstered the students’ interest and engagement in constructing knowledge of how to compose English-language essays. Fourth, through the scaffolded activities, the students learned how to scaffold themselves and thus became independent learners in their future knowledge-construction endeavours. Fifth, the scaffolding-to-scaffold activities provided the students with knowledge of how to effectively engage in transcendental scaffolding actions and facilitate the learning of English writing skills by less able peers within the learning community. Thus, the findings of this study extend earlier understandings of scaffolding in an EFL hybrid learning environment and will contribute to the advancement of future ICT-mediated courses in terms of their scaffolding pedagogical aspects.