《MIT Technology Review》 8/1
* 【Improving air quality can lower the risk of Alzheimer's disease】
According to several studies presented at the 2021 Alzheimer's Association International Conference, held July 26 in Denver, improving air quality improves cognitive function and lowers the risk of Alzheimer's disease. Earlier reports had linked long-term exposure to air pollution with Alzheimer's-related brain plaques; this conference is the first to present accumulated evidence that reducing pollution, particularly fine particulate matter and pollutants from fuel combustion, is associated with a lower risk of all-cause dementia and Alzheimer's disease.
* 【MIT scientists study how to reduce the environmental impact of disposable masks】
The COVID-19 pandemic is estimated to generate up to 7,200 tons of medical waste per day, much of it disposable masks. A new MIT study, which calculated the financial and environmental costs of several different mask-use scenarios, finds that adopting reusable masks could cut this waste dramatically. The researchers say a fully reusable silicone N95 mask would reduce waste even further, and they are now working to develop such a mask. The study has been published in the《British Medical Journal》.
* 【Newly developed urine and blood tests can detect brain tumors】
Medical researchers at the University of Cambridge have developed two new tests that can detect glioma, the most aggressive form of brain cancer. The tests can detect the tumor in a patient's urine or blood plasma, a world first for tests of this kind.
* 【European scientists develop a low-cost technique for making light-emitting materials】
Researchers led by the University of Cambridge and the Technical University of Munich found that by swapping one out of every 1,000 atoms in a material for another, they could boost the luminescence of a new class of light emitters known as halide perovskites two-fold. The finding could enable more efficient, low-cost light-emitting materials that are flexible and can be printed using inkjet techniques. The research was published in the《Journal of the American Chemical Society》.
* 【Harvard scientists launch the Galileo Project to search for extraterrestrial technological civilizations】
A Harvard-led team of scientists has launched the Galileo Project, which aims to search the universe for evidence of extraterrestrial life. Combining ground-based telescopes, artificial intelligence, and other approaches, the project will focus on physical evidence of extraterrestrial intelligence rather than electromagnetic signals from distant civilizations.
* 【Scientists find a potential therapy that boosts the immune system's ability to seek out and destroy cancer cells】
Researchers at the University of Southampton and the National Institute of Molecular Genetics in Milan have identified a potential treatment that enhances the human immune system's ability to find and kill cancer cells. They report a way to restrain the activity of a group of cells that regulate the immune system, which in turn frees other immune cells to attack the tumors of cancer patients. The study has been published in《PNAS》.
* 【US research team makes a breakthrough in solar hydrogen production】
For decades, researchers worldwide have sought a practical route to the key reaction for producing hydrogen with solar energy: splitting water molecules into hydrogen and oxygen. Most efforts have failed, and the few successes have been prohibitively expensive. A team at the University of Texas at Austin has now found a way to create conductive pathways through a thick silicon dioxide layer, efficiently separating oxygen from water. The method is cheap to implement and can be scaled to high-volume manufacturing. The details were published recently in《Nature Communications》.
* 【Nearly 20% of intact forest landscapes overlap with mining, oil, and gas concessions】
A new study by the Wildlife Conservation Society (WCS) and the World Wide Fund for Nature (WWF) shows that nearly 20% of tropical intact forest landscapes (IFLs) overlap with concessions for extractive industries such as mining, oil, and gas. The overlapping area totals about 975,000 square kilometers, roughly the size of Egypt. Mining concessions overlap the most with tropical IFLs, covering 11.33% of their total area, while oil and gas concessions cover 7.85%. The study was published in《Frontiers in Forests and Global Change》.
* 【MIT researchers use infrared cameras and AI to predict the "boiling crisis"】
Researchers in MIT's Department of Nuclear Science and Engineering have trained a neural network to predict the "boiling crisis." The researchers say the model can predict the margin to the boiling crisis (the departure from nucleate boiling ratio, DNBR) from high-resolution infrared measurements of bubble dynamics on surfaces with different morphologies and wettability. The work could be applied to cooling computer chips and nuclear reactors, and has been published in《Applied Physics Letters》.
* 【UK researchers use an innovative approach to "reverse" age-related memory decline】
A new study by UK researchers proposes an innovative approach to treating age-related memory decline. Preclinical work shows that "manipulating" the composition of brain structures known as perineuronal nets (PNNs) can reverse memory decline in aging mice.
* 【Chinese scientists use a simple RNA tweak to boost potato and rice yields by 50%】
A Peking University research team inserted a single gene called FTO into potato and rice plants. The resulting plants photosynthesize more efficiently, so they grow larger and yield more: a three-fold increase in the lab and a 50% increase in the field. They also grow longer roots, which helps them better tolerate drought.
* 【EU proposes a package of climate measures】
The European Commission recently proposed a package of measures to combat climate change, aiming to cut the EU's net greenhouse gas emissions by at least 55% from 1990 levels by 2030 and to achieve carbon neutrality by 2050. The proposal spans transport, energy, buildings, agriculture, and tax policy. Specific measures include tightening the existing emissions trading system, increasing the use of renewable energy, improving energy efficiency, rapidly rolling out low-carbon transport together with the supporting infrastructure and fuels, and aligning tax policy with decarbonization goals.
* 【Is a time crystal about to be born?】On July 28 local time, Google reported in a preprint that it had, for the first time, created a "genuine time crystal" using its "Sycamore" quantum computer.
More than 80 scientists participated in the research, from Stanford University, Princeton University, MIT, the Max Planck Institute for Chemical Physics of Solids in Dresden, and other institutions. The paper is titled "Observation of Time-Crystalline Eigenstate Order on a Quantum Processor."
* 【New molecular atlas reveals developmental trajectories of brain cells】
Researchers at the Swiss Federal Institute of Technology in Lausanne (EPFL) and Sweden's Karolinska Institute have, for the first time, mapped the genetic and developmental trajectories that embryonic brain cells follow as they mature. The molecular atlas can help identify genes linked to neurodevelopmental conditions and pinpoint the origin of malignant cells in brain cancers; it can also serve as a reference for assessing lab-grown brain tissue derived from stem cells and improve cell-replacement therapies for neurodegenerative diseases. The research was published recently in《Nature》.
* 【Liquid-filled optical fiber design enables more reliable data transmission】
Researchers at Switzerland's Empa institute have developed an optical fiber consisting of a continuous liquid glycerol core in a transparent fluoropolymer cladding. The fiber transmits data as light pulses about as well as a solid plastic optical fiber, while offering higher tensile strength.
journal on communications: featured post on Sheila Sim 沈琳宸's Facebook
Getting sentimental as I write this post.
Thank you so much for journeying with @jadeseah, @the_positive_movement and me through these six weeks of #CircuitBreaker. Big thanks to the regulars who joined us right from the beginning, when we were still trying to find our way and unsure whether what we did was going to be good. And thank you to everyone who walked with us, offered generous and precious feedback, and shared their stories with us.
Over these six weeks, we've shared so many conversations and tools on anxiety, boundaries, communication, decluttering of physical and emotional space, conflict management, relaxation techniques, and more. We've reflected and journaled on our experiences and stories. Thank you for listening. And most importantly, thank you for being vulnerable. Thank you for making this community a safe one. Thank you for giving us your trust. You've all been so brave and courageous!
#WonderandWellness was never meant to be just a place for chatting and sharing. Our goal is to equip everyone who joined with useful tools and skillsets they can carry with them for the rest of their lives.
Before you come to the session tomorrow (Sat) at 9pm, have a think about this:
"What is one lesson that Covid-19 has taught you that you always want to remember?"
We have talked for six weeks. As we come to an end with a PJ party and BYO(Lunch), we would like to hear from you instead. Share with us your biggest takeaway from this #CircuitBreaker and #WonderandWellness.
Looking forward to our session tomorrow!
By now you know the drill: come in your pyjamas, and bring a hot drink and writing materials. Tomorrow's session will run from 9pm-10.30pm! See you!
#sheilaloveherlife #throwback #dwtakesthebestphotosofme #wonderandwellness #ssfoodforthoughts #sspositivepsychology #positivepsychology #stayhome #emotionalhealing
journal on communications: top post on the Facebook of the Department of Electronics Engineering and Institute of Electronics, National Yang Ming Chiao Tung University (國立陽明交通大學電子工程學系及電子研究所)
【Talk】2019/11/19 (Tue) @ Engineering Building IV, Room 816 (智易空間): Prof. Geoffrey Li (Georgia Tech, USA) and Prof. Li-Chun Wang (NCTU, Taiwan) will speak on "Deep Learning based Wireless Resource Allocation / Deep Learning in Physical Layer Communications / Machine Learning Interference Management"
The IBM Center has invited Prof. Geoffrey Li (Georgia Tech, USA) and Prof. Li-Chun Wang (NCTU, Taiwan) to give these talks. Interested faculty and students are welcome to register!
Title: Deep Learning based Wireless Resource Allocation / Deep Learning in Physical Layer Communications / Machine Learning Interference Management
Speakers: Prof. Geoffrey Li and Prof. Li-Chun Wang
Time: 2019/11/19 (Tue) 9:00-12:00
Venue: NCTU Engineering Building IV, Room 816 (智易空間)
Registration link: https://forms.gle/vUr3kYBDB2vvKtca6
Registration details:
1. Fees (including lecture notes, lunch, and refreshments): (1) NCTU students free; students from other schools NT$300 per person. (2) Industry participants and faculty NT$1,500 per person.
2. Capacity: 60 people, admitted in order of completed registration (registration is complete only after payment).
※ How to register and pay:
1. Registration: fill in your details at the registration link.
2. Payment:
(1) Pay in person at Room 813, Engineering Building IV, NCTU (please call ahead before coming).
(2) Bank transfer details:
Account name: 曾紫玲 (Cathay United Bank, Hsinchu Science Park branch, code 013)
Account number: 075506235774 (Cathay United Bank, Hsinchu Science Park branch, code 013)
After transferring, please provide your name, the transfer time, and the last five digits of your account number for reconciliation.
※ Receipts for payment will be issued on the day of the talk.
Contact: 曾紫玲, Tel: 03-5712121 ext. 54599, Email: tzuling@nctu.edu.tw
Abstract:
1.Deep Learning based Wireless Resource Allocation
【Abstract】
Judicious resource allocation is critical to mitigating interference, improving network efficiency, and ultimately optimizing wireless network performance. The traditional wisdom is to explicitly formulate resource allocation as an optimization problem and then exploit mathematical programming to solve it to a certain level of optimality. However, as wireless networks become increasingly diverse and complex, such as high-mobility vehicular networks, the current design methodologies face significant challenges and thus call for rethinking of the traditional design philosophy. Meanwhile, deep learning represents a promising alternative due to its remarkable power to leverage data for problem solving. In this talk, I will present our research progress in deep learning based wireless resource allocation. Deep learning can help solve optimization problems for resource allocation or can be directly used for resource allocation. We will first present our research results in using deep learning to solve linear sum assignment problems (LSAP) and reduce the complexity of mixed integer non-linear programming (MINLP), and introduce graph embedding for wireless link scheduling. We will then discuss how to use deep reinforcement learning directly for wireless resource allocation with application in vehicular networks.
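As a concrete reference point for the LSAP part of the abstract above, here is a minimal brute-force assignment solver in plain Python. The cost matrix is an invented toy example, not data from the talk; a trained network would learn to approximate this optimal agent-to-task mapping at scales where enumeration is infeasible.

```python
from itertools import permutations

def solve_lsap(cost):
    """Brute-force linear sum assignment: assign each of n agents to a
    distinct task, minimizing total cost. O(n!) enumeration, so only for
    tiny n; this is the exact mapping a learned LSAP model approximates."""
    n = len(cost)
    best_perm, best_cost = None, float("inf")
    for perm in permutations(range(n)):
        total = sum(cost[agent][task] for agent, task in enumerate(perm))
        if total < best_cost:
            best_cost, best_perm = total, perm
    return best_perm, best_cost

# Toy 3x3 cost matrix (e.g., user-to-channel assignment costs; values made up)
cost = [[4, 1, 3],
        [2, 0, 5],
        [3, 2, 2]]
assignment, total = solve_lsap(cost)  # assignment[i] = task given to agent i
```

The appeal of the learned approach is that a forward pass through a network is far cheaper than combinatorial search when n is large, at the price of approximate optimality.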
2.Deep Learning in Physical Layer Communications
【Abstract】
It has been demonstrated recently that deep learning (DL) has great potential to break the bottleneck of conventional communication systems. In this talk, we present our recent work on DL in physical layer communications. DL can improve the performance of each individual (traditional) block in a conventional communication system or jointly optimize the whole transmitter or receiver. Therefore, we can categorize the applications of DL in physical layer communications into those with and without block processing structures. For DL based communication systems with block structures, we present joint channel estimation and signal detection based on a fully connected deep neural network, model-driven DL for signal detection, and some experimental results. For those without block structures, we present our recent endeavors in developing end-to-end learning communication systems with the help of deep reinforcement learning (DRL) and generative adversarial nets (GANs). At the end of the talk, we discuss some potential research topics in the area.
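To make the "block structure" point concrete, the sketch below implements one such hand-designed receiver block in pure Python: one-tap zero-forcing equalization followed by a hard decision for BPSK over a flat channel. The channel gain and noise level are invented toy values; this is the kind of hand-engineered chain that the DL receivers in the abstract would learn to replace or jointly optimize, not the authors' actual network.

```python
import random

def detect_bpsk(received, h):
    """One-tap zero-forcing equalization (divide by the channel gain)
    followed by a hard sign decision. A DL receiver would replace this
    hand-designed chain with a network mapping `received` to symbols."""
    return [1 if r / h >= 0 else -1 for r in received]

random.seed(0)
h = 0.8                                              # toy flat-fading channel gain
bits = [random.choice([-1, 1]) for _ in range(100)]  # BPSK symbols
rx = [h * b + random.gauss(0, 0.05) for b in bits]   # low-noise received samples
decoded = detect_bpsk(rx, h)
errors = sum(b != d for b, d in zip(bits, decoded))  # 0 at this noise level
```

A "joint" DL design would instead be trained on (rx, bits) pairs, estimating the effect of h implicitly rather than requiring it as an input.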
3.Machine Learning Interference Management
【Abstract】
In this talk, we discuss how machine learning algorithms can address the performance issues of high-capacity ultra-dense small cells in an environment with dynamic traffic patterns and time-varying channel conditions. We introduce a bi-adaptive self-organizing network (Bi-SON) to exploit the power of data-driven resource management in ultra-dense small cells (UDSC). On top of the Bi-SON framework, we further develop an affinity propagation unsupervised learning algorithm to improve energy efficiency and reduce interference of the operator-deployed and the plug-and-play small cells, respectively. Finally, we discuss the opportunities and challenges of reinforcement learning and deep reinforcement learning (DRL) in more decentralized, ad-hoc, and autonomous modern networks, such as Internet of Things (IoT), vehicle-to-vehicle, and unmanned aerial vehicle (UAV) networks.
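For intuition, the interference-management problem in a UDSC can be reduced to a toy channel-assignment problem. The sketch below uses simple greedy graph coloring, which is not the Bi-SON framework or affinity propagation from the talk, and the four-cell topology is hypothetical; it only illustrates the constraint that interfering neighbor cells should not share a channel.

```python
def assign_channels(interference_graph, n_channels):
    """Greedy graph coloring: give each small cell the lowest-numbered
    channel not used by any interfering neighbor already assigned.
    A toy stand-in for learning-based interference management."""
    channels = {}
    for cell in sorted(interference_graph):
        used = {channels[n] for n in interference_graph[cell] if n in channels}
        channels[cell] = next(c for c in range(n_channels) if c not in used)
    return channels

# Hypothetical UDSC topology: edges mark cells close enough to interfere
graph = {
    "A": ["B", "C"],
    "B": ["A", "C"],
    "C": ["A", "B", "D"],
    "D": ["C"],
}
plan = assign_channels(graph, n_channels=3)
```

A learning-based approach becomes attractive when the "graph" is not known a priori but must be inferred from measurements, and when traffic and channel conditions change over time.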
Bio:
Dr. Geoffrey Li is a Professor with the School of Electrical and Computer Engineering at the Georgia Institute of Technology. He was with AT&T Labs – Research for five years before joining Georgia Tech in 2000. His general research interests include statistical signal processing and machine learning for wireless communications. In these areas, he has published around 500 refereed journal and conference papers in addition to over 40 granted patents. His publications have been cited more than 37,000 times, and he has been listed as one of the World's Most Influential Scientific Minds, also known as a Highly Cited Researcher, by Thomson Reuters almost every year since 2001. He has been an IEEE Fellow since 2006. He received the 2010 IEEE ComSoc Stephen O. Rice Prize Paper Award, the 2013 IEEE VTS James Evans Avant Garde Award, the 2014 IEEE VTS Jack Neubauer Memorial Award, the 2017 IEEE ComSoc Award for Advances in Communication, and the 2017 IEEE SPS Donald G. Fink Overview Paper Award. He also won the 2015 Distinguished Faculty Achievement Award from the School of Electrical and Computer Engineering, Georgia Tech.
Li-Chun Wang (M'96 -- SM'06 -- F'11) received his Ph.D. degree from the Georgia Institute of Technology, Atlanta, in 1996. From 1996 to 2000, he was with AT&T Laboratories, where he was a Senior Technical Staff Member in the Wireless Communications Research Department. Currently, he is Chair Professor of the Department of Electrical and Computer Engineering and Director of the Big Data Research Center at National Chiao Tung University in Taiwan. Dr. Wang was elected an IEEE Fellow in 2011 for his contributions to cellular architectures and radio resource management in wireless networks. He was a co-recipient of the IEEE Communications Society Asia-Pacific Board Best Award (2015), the Y. Z. Hsu Scientific Paper Award (2013), and the IEEE Jack Neubauer Best Paper Award (1997). He won the Distinguished Research Award of the Ministry of Science and Technology in Taiwan twice (2012 and 2016). He is currently an associate editor of IEEE Transactions on Cognitive Communications and Networking. His current research interests are in the areas of software-defined mobile networks, heterogeneous networks, and data-driven intelligent wireless communications. He holds 23 US patents, has published over 300 journal and conference papers, and co-edited the book "Key Technologies for 5G Wireless Systems" (Cambridge University Press, 2017).