C4A Research Institute

Computing for All through Linguistic Interaction Technologies

We conduct research and development on dialogue system technologies and related fields such as natural language processing, artificial intelligence, and human-machine interaction. Dialogue systems such as chatbots and personal assistants are already in widespread use, but we believe there are many new applications for dialogue system technologies. Through applied research on dialogue systems, carried out in collaboration with industry and academia, we aim to enable everyone to use information technologies with ease. We also work to teach computer science and information technology through the research and development of dialogue system technologies.

"The Real Appeal of AI": Try It First Rather Than Writing Code - The Importance of No-Code in Deep Learning

What is the real appeal of deep learning technology? Hasn't the field ended up in a contest that does nothing but compete over accuracy?

Today, many no-code services for building AI (machine learning) models without programming are appearing, including DataRobot from DataRobot, Inc. and Neural Network Console from Sony.

AINOW has compiled an article covering more than 50 AI-building tools that require no programming.

RT @anjuniwata_: Thank you for featuring our work! We have always placed importance on reproducing the #記憶の色 (colors of memory) of people who lived through the war, brought back to life through dialogue. We hope that the exhibitions, video works, app, and books will give as many people as possible an opportunity to imagine war and peace as #自分ごと (something that concerns them personally)!
#戦争を伝える10代
#記憶の解凍 https://twitter.com/ntvnewszero/status/1292839267471732737

“What’s new in TensorFlow 2.3?”

TensorFlow 2.3 has been released! The focus of this release is on new tools to make it easier for you to load and preprocess data, and to solve input-pipeline bottlenecks, whether you’re working on one machine, or many.

tf.data adds two mechanisms to solve input pipeline bottlenecks and improve resource utilization. For advanced users, the new service API provides a way to improve training speed when the host attached to a training device can’t keep up with the data consumption needs of your model. It allows you to offload input preprocessing to a CPU cluster of data-processing workers that run alongside your training job, increasing accelerator utilization. A second new feature is the tf.data snapshot API, which allows you to persist the output of your input preprocessing pipeline to disk, so you can reuse it on a different training run. This enables you to trade storage space to free up additional CPU time…
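
Below is a minimal sketch of how these two mechanisms might be wired into an input pipeline, assuming TensorFlow 2.3, a running tf.data service dispatcher at a placeholder address, and hypothetical TFRecord paths; it is an illustration, not the release's reference code.

```python
import tensorflow as tf

AUTOTUNE = tf.data.experimental.AUTOTUNE

def parse_fn(serialized):
    # Stand-in for expensive per-example preprocessing (decode, augment, ...).
    features = {"x": tf.io.FixedLenFeature([28 * 28], tf.float32),
                "y": tf.io.FixedLenFeature([], tf.int64)}
    example = tf.io.parse_single_example(serialized, features)
    return example["x"], example["y"]

# Hypothetical TFRecord shards; adjust to your own data layout.
base = tf.data.Dataset.list_files("/data/train/*.tfrecord")
base = base.interleave(tf.data.TFRecordDataset, num_parallel_calls=AUTOTUNE)
base = base.map(parse_fn, num_parallel_calls=AUTOTUNE)

# Mechanism 1: offload preprocessing to a tf.data service cluster.
# "grpc://dispatcher:5000" is a placeholder for a running dispatcher address.
distributed = base.apply(tf.data.experimental.service.distribute(
    processing_mode="parallel_epochs",
    service="grpc://dispatcher:5000"))

# Mechanism 2: snapshot the preprocessed elements to disk so a later
# training run can read them back instead of recomputing the pipeline.
snapshotted = base.apply(
    tf.data.experimental.snapshot("/data/snapshots/train"))

train_ds = snapshotted.batch(32).prefetch(AUTOTUNE)
```

The two transformations address different bottlenecks: the service distributes preprocessing across remote workers when the training host's CPU is the limit, while snapshot trades disk space for CPU time by caching preprocessed data across runs.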

Prepare for Artificial Intelligence to Produce Less Wizardry

A new paper argues that the computing demands of deep learning are so great that progress on tasks like translation and self-driving is likely to slow.

Early last year, a large European supermarket chain deployed artificial intelligence to predict what customers would buy each day at different stores, to help keep shelves stocked while reducing costly spoilage of goods.

The company already used purchasing data and a simple statistical method to predict sales. With deep learning, a technique that has helped produce spectacular AI advances in recent years—as well as additional data, including local weather, traffic conditions, and competitors’ actions—the company cut the number of errors by three-quarters…