Abstract:
Several recently proposed ciphers, for example Rijndael and Serpent, are built with layers of small S-boxes interconnected by linear key-dependent layers. Their security relies on the fact that the classical methods of cryptanalysis (e.g. linear or differential attacks) are based on probabilistic characteristics, which makes their security grow exponentially with the number of rounds Nr. In this paper we study the security of such ciphers under an additional hypothesis: the S-box can be described by an overdefined system of algebraic equations (true with probability 1). We show that this is true for both Serpent (due to the small size of its S-boxes) and Rijndael (due to unexpected algebraic properties). We study general methods known for solving overdefined systems of equations, such as XL from Eurocrypt’00, and show their inefficiency. Then we introduce a new method called XSL that uses the sparsity of the equations and their specific structure. The XSL attack uses only relations true with probability 1, and thus the security does not have to grow exponentially in the number of rounds. XSL has a parameter P, and from our estimations it seems that P should be a constant or grow very slowly with the number of rounds. The XSL attack would then be polynomial (or subexponential) in Nr, with a huge constant that is double-exponential in the size of the S-box. The exact complexity of such attacks is not known due to the redundant equations. Though the presented version of the XSL attack always costs more than exhaustive search for Rijndael, it seems to (marginally) break 256-bit Serpent. We suggest a new criterion for the design of S-boxes in block ciphers: they should not be describable by a system of polynomial equations that is too small or too overdefined.
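The "overdefined system of equations" hypothesis can be checked mechanically for a small S-box. The sketch below is our own illustration (it assumes Serpent's S0 as the example S-box, which is not quoted in the abstract): it counts the linearly independent quadratic relations between input and output bits that hold with probability 1, i.e. the dimension of the null space of the monomial-evaluation matrix.

```python
import itertools

# Illustrative 4-bit S-box (Serpent's S0); the choice is ours, not the paper's.
SBOX = [3, 8, 15, 1, 10, 6, 5, 11, 14, 13, 4, 2, 7, 0, 9, 12]

def bits(v, n=4):
    return [(v >> i) & 1 for i in range(n)]

# One row per (input, output) pair: values of all monomials of degree <= 2
# in the 8 variables x0..x3 (input bits) and y0..y3 (output bits).
rows = []
for x in range(16):
    v = bits(x) + bits(SBOX[x])
    rows.append([1] + v + [a * b for a, b in itertools.combinations(v, 2)])

def rank_gf2(m):
    """Rank of a 0/1 matrix over GF(2) by Gaussian elimination."""
    m = [row[:] for row in m]
    r = 0
    for c in range(len(m[0])):
        piv = next((i for i in range(r, len(m)) if m[i][c]), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][c]:
                m[i] = [a ^ b for a, b in zip(m[i], m[r])]
        r += 1
    return r

# Every coefficient vector in the null space of this 16 x 37 matrix is a
# quadratic equation satisfied by the S-box with probability 1.
n_monomials = len(rows[0])                  # 1 + 8 + C(8,2) = 37
n_equations = n_monomials - rank_gf2(rows)
print(n_equations)
```

With 37 monomials and only 16 input/output pairs, at least 21 independent quadratic equations must exist for any 4-bit S-box, which is the "overdefined" effect the abstract exploits.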
Abstract:
The deformation of a rectangular block into an annular wedge is studied with respect to the state of swelling interior to the block. Nonuniform swelling fields are shown to generate these flexure deformations in the absence of resultant forces and bending moments. Analytical expressions for the deformation fields demonstrate these effects for both incompressible and compressible generalizations of conventional hyperelastic materials. Existing results in the absence of a swelling agent are recovered as special cases.
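In standard notation (our assumption; the symbols below are not taken from the abstract), the rectangular-block-to-annular-wedge flexure map and the swelling-modified incompressibility constraint take the form:

```latex
\begin{aligned}
&x_1 = r(X)\cos(\omega Y), \qquad x_2 = r(X)\sin(\omega Y), \qquad x_3 = Z,\\[4pt]
&\det\mathbf{F} \;=\; r'(X)\,\omega\,r(X) \;=\; v(X)
\quad\Longrightarrow\quad
r(X) \;=\; \sqrt{\,r_0^{2} + \frac{2}{\omega}\int_0^X v(s)\,ds\,},
\end{aligned}
```

so a nonuniform swelling field v(X) alone determines the bent shape, and v ≡ 1 recovers the classical incompressible flexure solution as a special case.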
Abstract:
PURPOSE To compare diffusion-weighted functional magnetic resonance imaging (DfMRI), a novel alternative to the blood oxygenation level-dependent (BOLD) contrast, with BOLD in a functional MRI experiment. MATERIALS AND METHODS Nine participants viewed contrast-reversing (7.5 Hz) black-and-white checkerboard stimuli using block and event-related paradigms. DfMRI (b = 1800 s/mm2) and BOLD sequences were acquired. Four parameters describing the observed signal were assessed: percent signal change, spatial extent of the activation, the Euclidean distance between peak voxel locations, and the time-to-peak (TTP) of the best-fitting impulse response for the different paradigms and sequences. RESULTS The BOLD conditions showed a higher percent signal change relative to DfMRI; however, event-related DfMRI showed the strongest group activation (t = 21.23, P < 0.0005). Activation was more diffuse and spatially closer to the BOLD response for DfMRI when the block design was used. Event-related DfMRI showed the shortest TTP (4.4 ± 0.88 sec). CONCLUSION The hemodynamic contribution to DfMRI may increase with the use of block designs.
Abstract:
Unique features and benefits of the plasma-aided nanofabrication are considered by using the "plasma-building block" approach, which is based on plasma diagnostics and nanofilm characterization, cross-referenced by numerical simulation of generation and dynamics of building blocks in the gas phase, their interaction with nanostructured surfaces, and ab initio simulation of chemical structure of relevant nanoassemblies. The examples include carbon nanotip microemitter structures, semiconductor quantum dots and nanowires synthesized in the integrated plasma-aided nanofabrication facility.
Abstract:
Thin plate spline finite element methods are used to fit a surface to an irregularly scattered dataset [S. Roberts, M. Hegland, and I. Altas. Approximation of a Thin Plate Spline Smoother using Continuous Piecewise Polynomial Functions. SIAM, 1:208--234, 2003]. The computational bottleneck for this algorithm is the solution of large, ill-conditioned systems of linear equations at each step of a generalised cross validation algorithm. Preconditioning techniques are investigated to accelerate the convergence of the solution of these systems using Krylov subspace methods. The preconditioners under consideration are block diagonal, block triangular and constraint preconditioners [M. Benzi, G. H. Golub, and J. Liesen. Numerical solution of saddle point problems. Acta Numer., 14:1--137, 2005]. The effectiveness of each of these preconditioners is examined on a sample dataset taken from a known surface. From our numerical investigation, constraint preconditioners appear to provide improved convergence for this surface fitting problem compared to block preconditioners.
Abstract:
A computationally efficient sequential Monte Carlo algorithm is proposed for the sequential design of experiments for the collection of block data described by mixed effects models. The difficulty in applying a sequential Monte Carlo algorithm in such settings is the need to evaluate the observed data likelihood, which is typically intractable for all but linear Gaussian models. To overcome this difficulty, we propose to unbiasedly estimate the likelihood, and perform inference and make decisions based on an exact-approximate algorithm. Two estimates are proposed: using quasi-Monte Carlo methods and using the Laplace approximation with importance sampling. Both of these approaches can be computationally expensive, so we propose exploiting parallel computational architectures to ensure designs can be derived in a timely manner. We also extend our approach to allow for model uncertainty. This research is motivated by important pharmacological studies related to the treatment of critically ill patients.
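The "unbiasedly estimate the likelihood" step can be illustrated on a toy one-random-effect block model (the model, names and numbers below are our own illustration, not the paper's pharmacological setting): the observed-data likelihood is an integral over the random effect, which plain Monte Carlo over the prior estimates unbiasedly. Here the toy model is linear Gaussian, so the exact value is available for comparison.

```python
import math, random

random.seed(1)
sigma2, tau2 = 1.0, 0.5          # observation and random-effect variances
y = [0.3, -0.2, 0.8]             # one block of observations sharing effect b

def norm_pdf(x, mean, var):
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def likelihood_estimate(n_draws=100_000):
    """Unbiased Monte Carlo estimate of p(y) = E_b[p(y | b)], b ~ N(0, tau2)."""
    total = 0.0
    for _ in range(n_draws):
        b = random.gauss(0.0, math.sqrt(tau2))
        p = 1.0
        for yi in y:
            p *= norm_pdf(yi, b, sigma2)
        total += p
    return total / n_draws

def exact_marginal():
    """Exact p(y): y ~ N(0, sigma2*I + tau2*J), compound-symmetry covariance."""
    n, s, t = len(y), sigma2, tau2
    det = s ** (n - 1) * (s + n * t)
    quad = (sum(v * v for v in y) - t / (s + n * t) * sum(y) ** 2) / s
    return math.exp(-quad / 2) / math.sqrt((2 * math.pi) ** n * det)

est = likelihood_estimate()
print(est, exact_marginal())
```

For the non-Gaussian models the abstract targets, `exact_marginal` is unavailable and only the unbiased estimator remains, which is exactly what the exact-approximate machinery requires.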
Abstract:
A critical stage in open-pit mining is determining the optimal extraction sequence of blocks, which has a significant impact on mining profitability. In this paper, a more comprehensive block sequencing optimisation model is developed for open-pit mines. In the model, material characteristics of blocks, grade control, excavator and block sequencing are investigated and integrated to maximise the short-term benefit of mining. Several case studies are modelled and solved by CPLEX MIP and CP engines. Numerical investigations are presented to illustrate and validate the proposed methodology.
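A toy version of the block-sequencing decision makes the structure concrete (the block values, precedence relation and discount factor below are invented for illustration; the paper's model additionally handles grade control and excavators): choose an extraction order that respects pit precedence and maximises time-discounted value.

```python
import itertools

# Hypothetical block values and pit precedence (a block is reachable only
# after the blocks above it have been extracted). All data are illustrative.
values = {"A": 5.0, "B": -1.0, "C": 8.0, "D": 3.0}
pred = {"A": set(), "B": set(), "C": {"A", "B"}, "D": {"B"}}
discount = 0.9  # per-period discount factor

def feasible(seq):
    """True if every block's predecessors appear earlier in the sequence."""
    seen = set()
    for b in seq:
        if not pred[b] <= seen:
            return False
        seen.add(b)
    return True

def npv(seq):
    """Discounted value of extracting seq[t] in period t."""
    return sum(values[b] * discount ** t for t, b in enumerate(seq))

# Brute force is fine at toy scale; the paper uses CPLEX MIP/CP engines
# for realistic instances.
best = max((s for s in itertools.permutations(values) if feasible(s)), key=npv)
print(best, round(npv(best), 3))
```

Note that the high-value block C is worth reaching early even though the waste block B (negative value) must be extracted first; this value-versus-access trade-off is what makes the sequencing problem nontrivial.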
Abstract:
New public management (NPM), with its hands-on, private sector-style performance measurement, output control, parsimonious use of resources, disaggregation of public sector units and greater competition in the public sector, has significantly affected charitable and nonprofit organisations delivering community services (Hood, 1991; Dunleavy, 1994; George & Wilding, 2002). The literature indicates that nonprofit organisations under NPM believe they are doing more for less: while administration is increasing, core costs are not being met; their dependence on government funding comes at the expense of other funding strategies; and there are concerns about proportionality and power asymmetries in the relationship (Kerr & Savelsberg, 2001; Powell & Dalton, 2011; Smith, 2002, p. 175; Morris, 1999, 2000a). Government agencies are under increased pressure to do more with less, demonstrate value for money, measure social outcomes (not merely outputs) and minimise political risk (Grant, 2008; McGregor-Lowndes, 2008). Government-community service organisation relationships are often viewed as 'uneasy alliances' characterised by the pressures that come with the parties' differing roles and expectations and the pressures of funding and security (Productivity Commission, 2010, p. 308; McGregor-Lowndes, 2008, p. 45; Morris, 2000a). Significant community services are now delivered to citizens through such relationships, often to the most disadvantaged in the community, and it is important for this to be achieved equitably, efficiently and effectively. On one level, the welfare state was seen as a 'risk management system' for the poor, with the state mitigating the risks of sickness, job loss and old age (Giddens, 1999), with the subsequent neoliberalist outlook shifting this risk back to households (Hacker, 2006). At the core of this risk shift are written contracts.
Vincent-Jones (1999, 2006) has mapped how NPM is characterised by the use of written contracts for all manner of relations; e.g., regulation of dealings between government agencies, between individual citizens and the state, and the creation of quasi-markets of service providers and infrastructure partners. We take this lens of contracts to examine where risk falls in relation to the outsourcing of community services. First we examine the concept of risk. We consider how risk might be managed and apportioned between governments and community service organisations (CSOs) in grant agreements, which are quasi-market transactions at best. This is informed by insights from the law and economics literature. Then, standard grant agreements covering several years in two jurisdictions - Australia and the United Kingdom - are analysed to establish the risk allocation between government and CSOs. This is placed in the context of the reform agenda in both jurisdictions. In Australia this context is the nonprofit reforms built around the creation of a national charities regulator, and red tape reduction. In the United Kingdom, the backdrop is the Third Way agenda with its compacts, succeeded by the Big Society in a climate of austerity. These 'case studies' inform a discussion about who is best placed to bear and manage the risks of community service provision on behalf of government. We conclude by identifying the lessons to be learned from our analysis and possible pathways for further scholarship.
Abstract:
Measurements of half-field beam penumbra were taken using EBT2 film for a variety of blocking techniques. It was shown that minimizing the SSD reduces the penumbra, as the effects of beam divergence are diminished. The addition of a lead block directly on the surface provides optimal results, with a 10-90% penumbra of 0.53 ± 0.02 cm. To resolve the uncertainties encountered in film measurements, future Monte Carlo simulations of half-field penumbras are to be conducted.
Abstract:
The combination of thermally- and photochemically-induced polymerization using light sensitive alkoxyamines was investigated. The thermally driven polymerizations were performed via the cleavage of the alkoxyamine functionality, whereas the photochemically-induced polymerizations were carried out either by nitroxide mediated photo-polymerization (NMP2) or by a classical type II mechanism, depending on the structure of the light-sensitive alkoxyamine employed. Once the potential of the various structures as initiators of thermally- and photo-induced polymerizations was established, their use in combination for block copolymer syntheses was investigated. With each alkoxyamine investigated, block copolymers were successfully obtained and the system was applied to the post-modification of polymer coatings for application in patterning and photografting.
Abstract:
A thiophene–tetrafluorophenyl–thiophene donor–acceptor–donor building block was used in combination with a furan-substituted diketopyrrolopyrrole for synthesizing the polymer semiconductor, PDPPF-TFPT. Due to the balance of tetrafluorophenylene/diketopyrrolopyrrole electron-withdrawing and furan/thiophene electron-donating moieties in the backbone, PDPPF-TFPT exhibits ambipolar behaviour in organic thin-film transistors, with hole and electron mobilities as high as 0.40 cm2 V−1 s−1 and 0.12 cm2 V−1 s−1, respectively.
Abstract:
Advances in nanomaterials/nanostructures offer the possibility of fabricating multifunctional materials for use in engineering applications. Carbon nanotube (CNT)-based nanostructures are a representative building block for these multifunctional materials. Based on a series of in silico studies, we investigated the possibility of tuning the thermal conductivity of a three-dimensional CNT-based nanostructure: a single-walled CNT-based super-nanotube. The thermal conductivity of the super-nanotubes was shown to vary with different connecting carbon rings and super-nanotubes with longer constituent single-walled CNTs and larger diameters had a smaller thermal conductivity. The inverse of the thermal conductivity of the super-nanotubes showed a good linear relationship with the inverse of the length. The thermal conductivity was approximately proportional to the inverse of the temperature, but was insensitive to the axial strain as a result of the Poisson ratio. These results provide a fundamental understanding of the thermal conductivity of the super-nanotubes and will guide their future design/fabrication and engineering applications.
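The reported linear relationship between inverse conductivity and inverse length can be sketched as a simple least-squares fit of 1/k = a + b/L. The data points below are synthetic, for illustration only; the intercept a extrapolates to the infinite-length conductivity.

```python
# Synthetic (invented) length/conductivity pairs showing the 1/k vs 1/L trend.
lengths = [20.0, 40.0, 80.0, 160.0]       # constituent lengths (nm, illustrative)
k_values = [50.0, 80.0, 114.0, 145.0]     # thermal conductivities (illustrative)

xs = [1.0 / L for L in lengths]
ys = [1.0 / k for k in k_values]

# Ordinary least squares for y = a + b*x in closed form.
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
    sum((x - xbar) ** 2 for x in xs)
a = ybar - b * xbar

k_bulk = 1.0 / a   # extrapolated L -> infinity conductivity
print(round(k_bulk, 1), round(b, 4))
```

A positive slope b encodes the boundary-scattering penalty of short constituent tubes, and the extrapolated k_bulk exceeds every finite-length value, consistent with the trend the abstract describes.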
Abstract:
The hemodynamic response function (HRF) describes the local response of brain vasculature to functional activation. Accurate HRF modeling enables the investigation of cerebral blood flow regulation and improves our ability to interpret fMRI results. Block designs have been used extensively as fMRI paradigms because detection power is maximized; however, block designs are not optimal for HRF parameter estimation. Here we assessed the utility of block design fMRI data for HRF modeling. The trueness (relative deviation), precision (relative uncertainty), and identifiability (goodness-of-fit) of different HRF models were examined and test-retest reproducibility of HRF parameter estimates was assessed using computer simulations and fMRI data from 82 healthy young adult twins acquired on two occasions 3 to 4 months apart. The effects of systematically varying attributes of the block design paradigm were also examined. In our comparison of five HRF models, the model comprising the sum of two gamma functions with six free parameters had greatest parameter accuracy and identifiability. Hemodynamic response function height and time to peak were highly reproducible between studies and width was moderately reproducible but the reproducibility of onset time was low. This study established the feasibility and test-retest reliability of estimating HRF parameters using data from block design fMRI studies.
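The best-performing model in the comparison, the sum of two gamma functions with six free parameters, can be sketched as follows. The parameter names and default values are the common SPM-style canonical choices, assumed here for illustration rather than taken from the study.

```python
import math

def gamma_pdf(t, shape, rate):
    """Gamma probability density, the building block of the HRF model."""
    return rate ** shape * t ** (shape - 1) * math.exp(-rate * t) / math.gamma(shape)

def double_gamma_hrf(t, height=1.0, a1=6.0, b1=1.0, a2=16.0, b2=1.0, ratio=1 / 6):
    """Six-parameter HRF: a positive peak minus a scaled undershoot term."""
    if t <= 0:
        return 0.0
    return height * (gamma_pdf(t, a1, b1) - ratio * gamma_pdf(t, a2, b2))

# The canonical shape: peak near t ~ 5 s, post-stimulus undershoot near t ~ 15 s.
samples = [(t, double_gamma_hrf(t)) for t in range(1, 26)]
t_peak = max(samples, key=lambda p: p[1])[0]
print(t_peak)
```

Height, time to peak, width, and onset time (the quantities whose reproducibility the abstract reports) are all derived from this curve, which is why the identifiability of its six parameters matters for block-design data.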