Hypercrosslinked polymers (HCPs) are currently receiving great interest owing to their easy preparation, high chemical and thermal stability, and low cost. Combined with their light weight and high surface areas, these properties make HCPs promising materials for gas storage and separation, catalysis, and the removal of heavy-metal ions in wastewater treatment. This Feature Article summarizes strategies for the preparation of HCPs, comprising the post-crosslinking of “Davankov-type” resins, the direct polycondensation of aromatic chloromethyl (or hydroxymethyl) monomers, and knitting aromatic compound polymers (KAPs). Applications of HCPs, such as H2 storage, CO2 capture, and heterogeneous catalysis, are also discussed throughout the article. Finally, an outlook on this research area is given.
The term hydrogel describes a type of soft, wet material formed by cross-linked hydrophilic polymers. The distinctive feature of hydrogels is their ability to absorb a large amount of water and swell. The properties of a hydrogel are usually determined by the chemical properties of its constituent polymer(s). However, a group of hydrogels, called “smart hydrogels,” change their properties in response to environmental changes or external stimuli. Recently, DNA and DNA-inspired responsive hydrogels have attracted considerable attention in the construction of smart hydrogels because of the intrinsic advantages of DNA. As a biological polymer, DNA is hydrophilic, biocompatible, and highly programmable through Watson-Crick base pairing. DNA can form a hydrogel by itself under certain conditions, and it can also be incorporated into synthetic polymers to form DNA-hybrid hydrogels. Functional DNAs, such as aptamers and DNAzymes, provide additional molecular recognition capabilities and versatility. In this Review, DNA-based hydrogels are discussed in terms of their stimulus responses, as well as their applications.
Poly(N-isopropylacrylamide)-block-poly(l-lactic acid)-block-poly(N-isopropylacrylamide) (PNIPAAM-b-PLLA-b-PNIPAAM) and PNIPAAM-b-PDLA-b-PNIPAAM triblock copolymers with varying polylactic acid (PLA) lengths are synthesized using a combination of ring-opening polymerization and atom-transfer radical polymerization. Results of 1H NMR and gel permeation chromatography analyses show that the copolymers have a well-defined triblock structure and that the PLA segment lengths can be readily controlled with the monomer feed ratio. Stereocomplexation between the enantiomeric PLA segments is confirmed by differential scanning calorimetry and wide-angle X-ray scattering. Dynamic light scattering experiments show that (1) the LCST of PNIPAAM in water can be tailored from 32 °C up to 38.5 °C by increasing the length of the PLA segments and by mixing copolymers of similar molecular weight with enantiomeric PLA segments to induce stereocomplexation, and (2) the LCST of each mixed copolymer system can be tailored within a 2–3 °C range of body temperature by manipulating the ratio of the enantiomeric copolymers in solution.
The network structure entropy has served as an index of network heterogeneity, but it does not account for the impact of isolated nodes on the network structure. In addition, the all-terminal reliability of a disconnected network is zero, which makes comparisons between disconnected networks impossible. The concept of network connectivity entropy is therefore proposed to overcome this limitation and to support a new index of network connectivity reliability. This study proves the following monotonicity rules: when edges are removed from a network, its connectivity reliability remains unchanged or decreases; conversely, when edges are added, its connectivity reliability remains unchanged or increases. The proposed index of network connectivity reliability is thus shown to be well founded. Furthermore, the impaired metro network of Nanjing is used as a case study to demonstrate the validity and practicality of network connectivity reliability. The results show that the new approach computes network connectivity reliability quickly and effectively, and also allows comparisons between different networks.
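The monotonicity rule above (deleting edges can never strengthen connectivity, adding edges can never weaken it) can be illustrated with a minimal all-terminal connectivity check. The entropy-based reliability index from the abstract is not reproduced here; the small graph and helper function below are illustrative assumptions of mine, not the paper's method:

```python
def connected(nodes, edges):
    """All-terminal connectivity: every node is reachable from every other,
    checked by a depth-first traversal from an arbitrary start node."""
    if not nodes:
        return True
    adj = {v: set() for v in nodes}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    seen, stack = set(), [next(iter(nodes))]
    while stack:
        u = stack.pop()
        if u in seen:
            continue
        seen.add(u)
        stack.extend(adj[u] - seen)
    return seen == set(nodes)

# A 4-node cycle: connected, and it stays connected after removing any one edge.
nodes = {1, 2, 3, 4}
edges = [(1, 2), (2, 3), (3, 4), (4, 1)]
assert connected(nodes, edges)
for e in edges:
    reduced = [x for x in edges if x != e]
    # Removing an edge leaves connectivity unchanged or weaker, never stronger.
    assert connected(nodes, edges) >= connected(nodes, reduced)
```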
We investigate the zero-dissipation limit problem for the one-dimensional compressible isentropic Navier-Stokes equations with Riemann initial data in the case of a composite wave of two shock waves. It is shown that the unique solution to the Navier-Stokes equations exists for all time and converges, as the viscosity vanishes, to the Riemann solution of the corresponding Euler equations with the same Riemann initial data, uniformly on sets away from the shocks. In contrast to previous related works, where either the composite wave is absent or the effects of initial layers are ignored, this gives the first mathematical justification of this limit for the compressible isentropic Navier-Stokes equations in the presence of both a composite wave and initial layers. Our method of proof consists of a scaling argument, the construction of an approximate solution, and delicate energy estimates.
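For orientation, the 1D compressible isentropic Navier-Stokes system and the Riemann data referred to above can be written in the standard textbook form (my transcription, not quoted from the paper; $\varepsilon > 0$ is the viscosity and $\gamma > 1$ the adiabatic exponent):

```latex
\begin{aligned}
&\rho_t + (\rho u)_x = 0,\\
&(\rho u)_t + \bigl(\rho u^2 + p(\rho)\bigr)_x = \varepsilon\, u_{xx},
\qquad p(\rho) = \rho^{\gamma},
\end{aligned}
\qquad
(\rho, u)(0, x) =
\begin{cases}
(\rho_-, u_-), & x < 0,\\
(\rho_+, u_+), & x > 0.
\end{cases}
```

Setting $\varepsilon = 0$ gives the Euler equations; the zero-dissipation limit asks that the Navier-Stokes solution converge to the Riemann solution of this Euler system as $\varepsilon \to 0$.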
Importance sampling is a classical Monte Carlo technique in which a random sample from one probability density, π1, is used to estimate an expectation with respect to another, π. The importance sampling estimator is strongly consistent and, as long as two simple moment conditions are satisfied, it obeys a central limit theorem (CLT). Moreover, there is a simple consistent estimator for the asymptotic variance in the CLT, which makes for routine computation of standard errors. Importance sampling can also be used in the Markov chain Monte Carlo (MCMC) context. Indeed, if the random sample from π1 is replaced by a Harris ergodic Markov chain with invariant density π1, then the resulting estimator remains strongly consistent. There is a price to be paid, however, as the computation of standard errors becomes more complicated. First, the two simple moment conditions that guarantee a CLT in the iid case are not enough in the MCMC context. Second, even when a CLT does hold, the asymptotic variance has a complex form and is difficult to estimate consistently. In this article, we explain how to use regenerative simulation to overcome these problems. Actually, we consider a more general setup, where we assume that Markov chain samples from several probability densities, π1, …, πk, are available. We construct multiple-chain importance sampling estimators for which we obtain a CLT based on regeneration. We show that if the Markov chains converge to their respective target distributions at a geometric rate, then under moment conditions similar to those required in the iid case, the MCMC-based importance sampling estimator obeys a CLT. Furthermore, because the CLT is based on a regenerative process, there is a simple consistent estimator of the asymptotic variance. We illustrate the method with two applications in Bayesian sensitivity analysis. The first concerns one-way random effect models under different priors. 
The second involves Bayesian variable selection in linear regression, and for this application, importance sampling based on multiple chains enables an empirical Bayes approach to variable selection.
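The iid case described at the start of the abstract, with the estimator, its CLT, and the simple consistent standard-error estimator, can be sketched as follows. This is a minimal illustration assuming normalized densities; the function names and the Gaussian example are my assumptions, and the regenerative MCMC extension that is the abstract's main contribution is not shown:

```python
import numpy as np

def gauss_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

def importance_sampling(h, target_pdf, proposal_pdf, draw, n, rng):
    """Estimate E_pi[h(X)] from draws x_i ~ pi1 via the weighted average
    (1/n) * sum_i h(x_i) * pi(x_i)/pi1(x_i), with a CLT-based standard error."""
    x = draw(n, rng)
    terms = h(x) * target_pdf(x) / proposal_pdf(x)  # h(x_i) times importance weight
    est = terms.mean()                              # strongly consistent estimator
    se = terms.std(ddof=1) / np.sqrt(n)             # consistent SE from the iid CLT
    return est, se

rng = np.random.default_rng(0)
# Estimate E[X^2] = 1 under pi = N(0,1), drawing from the heavier-tailed
# proposal pi1 = N(0, 2^2) so both moment conditions hold.
est, se = importance_sampling(
    h=lambda x: x ** 2,
    target_pdf=lambda x: gauss_pdf(x, 0.0, 1.0),
    proposal_pdf=lambda x: gauss_pdf(x, 0.0, 2.0),
    draw=lambda n, rng: rng.normal(0.0, 2.0, size=n),
    n=200_000,
    rng=rng,
)
```

In the MCMC setting of the article, the iid draws are replaced by a Harris ergodic chain with invariant density pi1, and the simple `se` line above is exactly what breaks down, motivating the regeneration-based variance estimator.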
We use a pure energy method recently developed by Guo and Wang to prove the optimal time decay rates of the solutions to the compressible magnetohydrodynamic equations in the whole space. In particular, the optimal decay rates of the higher-order spatial derivatives of solutions are obtained.
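The system in question is the compressible magnetohydrodynamic equations, which in a standard whole-space formulation read (my transcription of the usual textbook form, not quoted from the paper; $\mu, \lambda$ are viscosity coefficients and $\nu$ the magnetic diffusivity):

```latex
\begin{aligned}
&\rho_t + \nabla\!\cdot(\rho u) = 0,\\
&(\rho u)_t + \nabla\!\cdot(\rho u \otimes u) + \nabla P(\rho)
  = \mu \Delta u + (\mu + \lambda)\nabla(\nabla\!\cdot u) + (\nabla\times B)\times B,\\
&B_t - \nabla\times(u \times B) = \nu \Delta B, \qquad \nabla\!\cdot B = 0.
\end{aligned}
```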