Conversational AI TensorFlow.js NLP examples with Wechaty SDK and Angular
Huan (李卓桓)
You can apply for a Windows / Pad protocol token from our puppet service providers:
Copy the following shell script and paste it into the terminal of your server to set up your Wechaty token:
# learn how to DIY a Wechaty Puppet Service token at http://wechaty.js.org/docs/puppet-services/diy
export WECHATY_TOKEN=insecure_wechaty_puppet_service_token_diy
# Set the port for your Puppet Service: it must be publicly accessible on the internet
# The Wechaty IO Client uses this port to publish the Puppet Service
export WECHATY_PUPPET_SERVER_PORT=48788
# learn more about Wechaty Puppet PadLocal at https://wechaty.js.org/docs/puppet-services/padlocal
export WECHATY_PUPPET=wechaty-puppet-padlocal
# get a 7-day free token at the PadLocal official website: http://pad-local.com/
export WECHATY_PUPPET_PADLOCAL_TOKEN=YOUR_PADLOCAL_TOKEN_AT_HERE
export WECHATY_LOG=verbose
docker run \
--rm \
-ti \
-e WECHATY_LOG \
-e WECHATY_PUPPET \
-e WECHATY_PUPPET_PADLOCAL_TOKEN \
-e WECHATY_PUPPET_SERVER_PORT \
-e WECHATY_TOKEN \
-p "$WECHATY_PUPPET_SERVER_PORT:$WECHATY_PUPPET_SERVER_PORT" \
wechaty/wechaty:0.78
Learn more: the "Puppet Service: DIY" guide will help you generate a Wechaty token for connecting to the Wechaty Puppet Service.
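Once the container is running, any Wechaty client can reach the bot through that token. The following is a minimal TypeScript sketch of a plain Node.js client (separate from the Angular app built below), assuming Wechaty 0.x with the wechaty-puppet-service client; the token value is the insecure DIY placeholder from the script above.
// bot.ts: connect to the Puppet Service published by the Docker container above.
// A sketch only: assumes Wechaty 0.x and wechaty-puppet-service are installed.
import { Wechaty } from 'wechaty'

const bot = new Wechaty({
  puppet: 'wechaty-puppet-service',
  puppetOptions: {
    token: 'insecure_wechaty_puppet_service_token_diy',
  },
})

bot
  .on('scan',    (qrcode, status) => console.info(`Scan to log in (status ${status}): ${qrcode}`))
  .on('login',   user             => console.info(`${user} logged in`))
  .on('message', message          => console.info(message.toString()))

bot.start()
  .catch(console.error)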
Our live coding has four steps; each step is saved in a separate branch so it can be checked out and tested easily.
ng new my-app
Branch: ng_china_2020_step_1_ng_new_my-app
npx --package @angular/cli ng new my-app
cd my-app
ng serve --open
Learn more from https://angular.io/guide/setup-local
Branch: ng_china_2020_step_2_wechaty
npm i @chatie/angular brolog
app.module.ts
import { WechatyModule } from '@chatie/angular'
@NgModule({
  imports: [
    WechatyModule,
    ...
  ],
  ...
})
export class AppModule { }
app.component.html
<wechaty
  #wechaty
  token="insecure_wechaty_puppet_service_token_diy"
  (heartbeat) = "onHeartbeat($event)"
  (scan)      = "onScan($event)"
  (login)     = "wechaty.startSyncMessage(); onLogin($event)"
  (message)   = "onMessage($event)"
>
</wechaty>
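The event bindings above expect matching handler methods on the component class. A minimal sketch of app.component.ts follows; the event payloads are typed loosely here because their exact types come from @chatie/angular.
// app.component.ts: handlers referenced by the <wechaty> template above.
import { Component } from '@angular/core'

@Component({
  selector: 'app-root',
  templateUrl: './app.component.html',
})
export class AppComponent {
  onHeartbeat (data: any)    { console.info('heartbeat:', data) }
  onScan      (qrcode: any)  { console.info('scan:', qrcode) }
  onLogin     (user: any)    { console.info('login:', user) }
  onMessage   (message: any) { console.info('message:', message) }
}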
Branch: ng_china_2020_step_3_toxicity
npm install @tensorflow/tfjs
npm install @tensorflow-models/toxicity
ng generate service toxicity
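Below is a minimal sketch of what the generated service can look like once it wraps the toxicity model. The ToxicityService name comes from the ng generate command above; the threshold value and the isToxic() helper are illustrative, not the code used in the talk.
// toxicity.service.ts: wrap the TensorFlow.js toxicity model in an Angular service.
import { Injectable } from '@angular/core'
import '@tensorflow/tfjs'
import * as toxicity from '@tensorflow-models/toxicity'

@Injectable({ providedIn: 'root' })
export class ToxicityService {
  // Labels with a probability below this threshold are reported as null matches.
  private readonly threshold = 0.9
  // Cache the loading promise so the model is only downloaded once.
  private modelPromise?: ReturnType<typeof toxicity.load>

  async isToxic (text: string): Promise<boolean> {
    if (!this.modelPromise) {
      // An empty label list means "classify against all seven toxicity labels".
      this.modelPromise = toxicity.load(this.threshold, [])
    }
    const model = await this.modelPromise
    const predictions = await model.classify([text])
    // One prediction per label; `match` is true when that label exceeds the threshold.
    return predictions.some(prediction => prediction.results[0].match === true)
  }
}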
Learn more:
The traffic light code is copy/pasted from this great tutorial: Stop in the Name of the Traffic Light
To be written.
Branch: step_4_tensorflow-models_qna
npm install @tensorflow-models/qna
// to be written
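As a starting point, here is a minimal sketch of wrapping the Question and Answer (MobileBERT) model in a service, mirroring the toxicity service from step 3. The QnaService name and the answer() helper are illustrative.
// qna.service.ts: wrap the TensorFlow.js question-and-answer model in an Angular service.
import { Injectable } from '@angular/core'
import '@tensorflow/tfjs'
import * as qna from '@tensorflow-models/qna'

@Injectable({ providedIn: 'root' })
export class QnaService {
  // Cache the loading promise so the MobileBERT model is only downloaded once.
  private modelPromise?: ReturnType<typeof qna.load>

  async answer (question: string, passage: string): Promise<string | undefined> {
    if (!this.modelPromise) {
      this.modelPromise = qna.load()
    }
    const model = await this.modelPromise
    // findAnswers() returns candidate spans from the passage, each with a score.
    const answers = await model.findAnswers(question, passage)
    const best = answers.sort((a, b) => b.score - a.score)[0]
    return best?.text
  }
}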
Learn more:
November 21 - 22 @online
Knowledge, ideas, and insights for the Next Generation
Google Slides https://docs.google.com/presentation/d/1Gd3D8bS6OifXDsdSe0x5i6XsP_uISX3W9tR8yBA0mYs/edit?usp=sharing
Talk Video: https://youtu.be/SACugbTNQnc
Huan LI (李卓桓), Google Machine Learning Developer Expert, zixia@zixia.net