How to get video from applications, like a web browser, into Syphon.
This example shows how to get a YouTube video from Firefox into Max/MSP Vizzie.
Use NDI Scan Converter (from NDI Tools: https://ndi.video/tools/) and select Firefox from the Capture menu.
Use NDISyphon to convert the NDI stream to Syphon. Use virtual audio routing (e.g., BlackHole) for system audio, or route audio directly into Max.
In Max, use the Syphon client from the package manager to receive video. Here is an example using Vizzie abstractions described in previous posts:
notes:
NDI Scan Converter captures the screen output of apps and broadcasts it in NDI format over the LAN. The NDISyphon app then converts the NDI stream into Syphon. For YouTube, use theatre mode to fill the window – full screen doesn’t really work. See this link: https://support.telestream.net/s/article/Wirecast-Remote-Computer-Screen-Capture-with-NDI-macOS
Mapping geocoded contest log data using node.js and openlayers.
The goal was to make something that looks like the Reverse Beacon Network map, only for contest log files. I use RBN for testing antennas now. That map display gives you a pretty good idea of your actual antenna pattern.
The code is written in node.js (JavaScript) and HTML.
Part 1: Read a Cabrillo log file containing QSO: records. Look up each callsign, get latitude and longitude, and rewrite the file as JSON data tagged with geo coordinates. I originally tried getting the data from HamQTH, but it was not current, so I ended up using the qrz.com XML callsign lookup. For callsigns that were not found, I used the qrz.com DXCC prefix lookup to get general coordinates for the country. There are still a few bad/missing data issues to resolve, like European stations with coordinates at the South Pole.
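Here is a minimal sketch of that lookup step in node.js, assuming a QRZ.com XML session key is already available and node 18+ for fetch. The field positions and helper names are placeholders – the actual code in internetsensors/cabrillomap is organized differently.

// geocode-cabrillo.js – sketch of Part 1: QSO: records -> geocoded JSON
const fs = require('fs');
const QRZ_SESSION_KEY = process.env.QRZ_SESSION_KEY;

async function lookupCall(callsign) {
  const url = `https://xmldata.qrz.com/xml/current/?s=${QRZ_SESSION_KEY};callsign=${callsign}`;
  const xml = await (await fetch(url)).text();
  const lat = xml.match(/<lat>([-\d.]+)<\/lat>/);
  const lon = xml.match(/<lon>([-\d.]+)<\/lon>/);
  if (!lat || !lon) return null;               // "not found" – caller should fall back to the DXCC lookup
  return { lat: Number(lat[1]), lon: Number(lon[1]) };
}

async function main() {
  const lines = fs.readFileSync('testdata.cbr', 'utf8').split('\n');
  const qsos = [];
  for (const line of lines) {
    if (!line.startsWith('QSO:')) continue;    // keep only QSO: records
    const fields = line.trim().split(/\s+/);
    const call = fields[8];                    // worked callsign – position depends on the contest format
    const geo = await lookupCall(call);        // TODO: DXCC prefix lookup when null
    qsos.push({ record: line.trim(), call, geo });
  }
  fs.writeFileSync('geocab.json', JSON.stringify(qsos, null, 2));
}

main().catch(console.error);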
Part 2: I tried various mapping frameworks – Leaflet, ArcGIS, and OpenLayers. I wanted to use a great-circle projection (azimuthal equidistant) like the big ARRL world map, and may still figure this out, but working with map projections and coordinate transforms is way worse than doing a Smith chart. I ended up hacking a flight tracking example from openlayers.org and basically replacing airplanes with QSOs. That is why the lines are animated from source to destination. I also added a layer for day/night, and a QSO/time status display.
It probably makes sense to get rid of the flight animation and just display the entire path in sync with the QSO data – with a color code for each band (K1KP) – and speed control on the time lapse, etc., so you can get a better sense of rate and propagation.
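As a rough sketch of that simpler version, each QSO could be drawn as a static vector line, colored by band, on an OpenLayers map. The layer and field names below are mine, not taken from the actual main.js.

// qso-lines.js – sketch: draw each QSO as a colored line on an OpenLayers map
import Map from 'ol/Map';
import View from 'ol/View';
import TileLayer from 'ol/layer/Tile';
import OSM from 'ol/source/OSM';
import VectorLayer from 'ol/layer/Vector';
import VectorSource from 'ol/source/Vector';
import Feature from 'ol/Feature';
import LineString from 'ol/geom/LineString';
import {fromLonLat} from 'ol/proj';
import {Stroke, Style} from 'ol/style';

const BAND_COLORS = { '80': '#ff0000', '40': '#00aa00', '20': '#0000ff' };   // per-band colors (arbitrary)
const qsoSource = new VectorSource();

function addQso(qso) {
  // qso = { band, from: [lon, lat], to: [lon, lat] } – shape assumed from geocab.json
  const line = new Feature(new LineString([fromLonLat(qso.from), fromLonLat(qso.to)]));
  line.setStyle(new Style({ stroke: new Stroke({ color: BAND_COLORS[qso.band] || '#888', width: 2 }) }));
  qsoSource.addFeature(line);
}

new Map({
  target: 'map',
  layers: [new TileLayer({ source: new OSM() }), new VectorLayer({ source: qsoSource })],
  view: new View({ center: fromLonLat([-70, 44]), zoom: 3 }),   // roughly centered on Maine
});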
It would be cool to have a website where you could upload a log file and generate maps.
Note: this project is not yet available
Files
local files:
generating data:
internetsensors/cabrillomap
Put the Cabrillo data in testdata.cbr (use QSO: records only for now). Records should be sorted chronologically.
run: node index.js
the output file will be: geocab.json (which is used as input to the mapping program)
mapping
internetsensors/oltest
main.js = node source with ol mapping and data processing
index.html = web page for map
geocab.json = geocoded cabrillo json test data
to run, type: npm start
Then open: http://localhost:5173/ in a browser
Additional work / current issues
Some of the qrz.com callsign data has bad geo coordinates. In particular, some of the records show a latitude of -89 and a longitude of -179 – need to check for these numbers and replace them with DXCC coordinates (see the sketch after these notes).
There should be a command-line argument to the node program to pass in the data file. The program should also filter out any non-QSO: records, like the file header info and any X-QSO records.
Also need to clean up the async/await code – currently there are several different methods for handling state transitions.
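A rough sketch of those two cleanup steps (function names are placeholders, not the actual index.js code):

// 1. Replace the bogus qrz.com coordinates with DXCC country coordinates.
function fixCoordinates(qso, dxccLookup) {
  if (qso.geo && qso.geo.lat === -89 && qso.geo.lon === -179) {
    qso.geo = dxccLookup(qso.call);            // fall back to country-level coordinates
  }
  return qso;
}

// 2. Take the data file as an argument and keep only QSO: records.
const dataFile = process.argv[2] || 'testdata.cbr';      // e.g. node index.js mylog.cbr
const qsoLines = require('fs')
  .readFileSync(dataFile, 'utf8')
  .split('\n')
  .filter((line) => line.startsWith('QSO:'));            // drops the header and X-QSO records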
mapping ideas:
As mentioned above, it's probably a good idea to make a version of the code without the flight animation, with various controls to stop/start the data playback, look at individual QSOs, control the playback speed, etc.
azimuthal equidistant projection: there are some links to examples in Leaflet and ArcGIS for handling complex projections. In documents, look at: “map links for projection stuff.txt”
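For reference, one way to experiment with an azimuthal equidistant view in OpenLayers is to register an aeqd projection with proj4 and use it in the map view. This is only a starting point, not code from the project files.

// aeqd-view.js – sketch: azimuthal equidistant projection in OpenLayers via proj4
import proj4 from 'proj4';
import {register} from 'ol/proj/proj4';
import {get as getProjection, fromLonLat} from 'ol/proj';
import Map from 'ol/Map';
import View from 'ol/View';

// azimuthal equidistant centered roughly on the station QTH (Maine)
proj4.defs('AEQD-QTH', '+proj=aeqd +lat_0=44 +lon_0=-70 +datum=WGS84 +units=m +no_defs');
register(proj4);

const aeqd = getProjection('AEQD-QTH');

const map = new Map({
  target: 'map',
  layers: [/* vector layers reprojected into aeqd */],
  view: new View({
    projection: aeqd,
    center: fromLonLat([-70, 44], aeqd),
    zoom: 2,
  }),
});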
leaflet test version:
In the internetsensors/cabrillomap folder there’s a test file, cbworld1.html, that works using websockets when you run the index.js file to generate test data. It uses a Leaflet map, but the lines don’t adapt to great circle polar paths.
arcgis
I believe the arcgis examples are in internetsensors/projected geometries
And: internetsensors/pe-gs-projection
The former is a very nice world projection with some point markers. The latter is an example that shows how to switch out various projections in real time.
Using 40 meter SSB, Max/MSP, and WebSDR to build a feedback delay from Maine to the Netherlands.
A Max patch plays an audio file into the transmitter in Maine. Then, using a WebSDR receiver in the Netherlands, the received signal is amplified and mixed back into the audio to be retransmitted, effectively creating a feedback delay line.
Here is an example of what it sounds like on 40 meters.
The audio from WebSDR is routed to the input of Max by creating a multi-output device (in Audio MIDI Setup) combining “BlackHole 2ch” and the external headphone output (for monitoring). The audio output of Max is assigned to the transmitter SSB input.
Patch not yet available. Local version is in max teaching examples.
Attempts to use the X API (formerly Twitter) for projects with Max/MSP have been disappointing at best. Most of the API is behind a paywall now. The cost is $5,000 per month to implement the streaming API used in projects like this: https://reactivemusic.net/?p=5786
The free tier only allows basic tweeting and user lookup. Search is not available.
I was able to find only one node example that actually worked in the free tier, by “Coding with Ado”. The code requests a token from X, and then you enter a timestamped PIN to continue – making it worthless for programs and bots. https://youtu.be/G5ZW5j5cwHk?si=vbAtGa0bQ3T_tga9
A local copy of the source code for this is in tkzic/nodetweet3/index.js
There is also a service that sits in the middle to handle X API calls; you are charged by the number of calls. It doesn’t offer streaming either, but you can simulate streaming by running a search every few seconds.
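A minimal sketch of that polling approach, assuming a hypothetical searchX() helper that wraps whatever search call the service actually provides:

// simulate-stream.js – poll a search endpoint to approximate a streaming API
const POLL_INTERVAL_MS = 5000;     // "every few seconds"
let newestId = null;               // most recent result we've already seen

async function searchX(query, sinceId) {
  // placeholder: call the service's search endpoint here and return an array
  // of results, newest first, each with at least { id, text }
  return [];
}

async function poll(query) {
  const results = await searchX(query, newestId);
  if (results.length > 0) {
    newestId = results[0].id;                                    // remember where we left off
    results.reverse().forEach((r) => console.log(r.id, r.text)); // "streamed" results
  }
}

setInterval(() => poll('#maxmsp').catch(console.error), POLL_INTERVAL_MS);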
Other social media options
There are APIs for other social sites like Facebook, Instagram, TikTok, etc.
In the previous version, all of the markers for each train line were deleted and redrawn as a group at the polling interval. In this version, the train markers move to their new locations when the data is polled.
The project uses the MBTA JSON API to query vehicle data for each train line. The geo coordinates of the trains are sent via websockets to the client map page.
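A minimal sketch of that server side, assuming the MBTA v3 API, an API key, node 18+ for fetch, and the ws npm package – the actual project files may be organized differently, and the port and route list here are arbitrary:

// mbta-ws.js – poll the MBTA API and push train positions to websocket clients
const WebSocket = require('ws');
const wss = new WebSocket.Server({ port: 8125 });
const ROUTES = ['Red', 'Orange', 'Blue'];            // train lines to poll

async function pollTrains() {
  for (const route of ROUTES) {
    const url = `https://api-v3.mbta.com/vehicles?filter[route]=${route}&api_key=${process.env.MBTA_API_KEY}`;
    const json = await (await fetch(url)).json();
    const trains = json.data.map((v) => ({
      id: v.id,
      route,
      lat: v.attributes.latitude,
      lon: v.attributes.longitude,
    }));
    const msg = JSON.stringify({ route, trains });   // send updated positions to every connected map page
    wss.clients.forEach((client) => {
      if (client.readyState === WebSocket.OPEN) client.send(msg);
    });
  }
}

setInterval(() => pollTrains().catch(console.error), 10000);   // poll every 10 seconds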
This project is a work in progress and not available yet.
Files:
node server: internetsensors/mbtanode2/index.html
html map client: internetsensors/mbtanode2/mbtatrain2.html
Instructions
To run, type node index.js at a command line.
Then open a web browser to: http://127.0.0.1:8124/
Type 'go' into the text box below the map, and press the 'send' button.
In a few seconds the train markers will magically appear.
I needed a text to Morse code generator in Max for the Twitter streaming map project. There was an ancient one that used [mxj], but it's kind of a pain to use that object. I thought it can't be that difficult to write one? I didn't really have any idea where to start. Something about the blank Max patch causes brain activity? I went through about 5 different approaches. Eventually I came up with this pattern thing, from thinking about the lighted buttons on the TR-808 drum machine.
For example, the letter A is . _ (dot dash)
Morse code has rules for spacing:
dot = 1
space between tones = 1
dash = 3
space between letters = 3
space between words = 7
If you think of a drum machine pattern, the pattern for letter A would be: 1 0 1 1 1 0 0 0 (with the 3 trailing 0’s for letter spacing)
I made a [coll] with all the letter patterns indexed by ASCII codes.
Then the letter patterns for a given block of text are concatenated into one big list and run through [zl.nth], clocked by a [metro] and [counter]. The ones and zeros turn an oscillator on and off.
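Outside of Max, the same expansion is easy to sketch in JavaScript. This is only an illustration of the pattern scheme described above, with an abbreviated letter table and my own function names:

// morse-pattern.js – expand text into the drum-machine style 1/0 pattern described above
// dot = 1, dash = 111, gap between tones = 0,
// gap between letters = 000, gap between words = 0000000 total

const MORSE = { A: '.-', B: '-...', C: '-.-.', D: '-..', E: '.', S: '...', O: '---' };   // abbreviated table

function letterPattern(letter) {
  const code = MORSE[letter.toUpperCase()];
  if (!code) return [];
  const bits = [];
  code.split('').forEach((symbol, i) => {
    if (i > 0) bits.push(0);                              // gap between tones
    bits.push(...(symbol === '.' ? [1] : [1, 1, 1]));     // dot or dash
  });
  return bits.concat([0, 0, 0]);                          // gap between letters
}

function textPattern(text) {
  return text.trim().split(/\s+/).flatMap((word, i) => {
    const wordBits = word.split('').flatMap(letterPattern);
    return i > 0 ? [0, 0, 0, 0, ...wordBits] : wordBits;  // 3 letter-gap zeros + 4 more = 7 between words
  });
}

console.log(textPattern('A').join(' '));   // 1 0 1 1 1 0 0 0 – matches the example above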
local file: tkzic/internetsensors/twitter-stream2/morse5.maxpat
Milford Graves was one of my all time favorite musicians. His approach to percussion, and music generally, was unique in a way that defies explanation.
I sampled a bunch of clips of his drumming into Ableton Live and then experimented with the Buffer Shuffler 2.0 device to see if I could randomize small slices, i.e., several seconds each, of longer samples – without losing the “texture” of the original recordings.
Here is an example of what it sounds like:
This video shows a clip from David Murray’s “Real Deal” running through Buffer Shuffler using slices only about 2-3 seconds in length. The slicing rate is just arbitrary, since there is no warping or specific clock pulse.
Local files: tkzic/aardvark/milfordgraves1 project/milfordgraves1a.als