If you start the data with the letter @, the rest should be a file name to read the data from, or - if you want curl to read the data from stdin. Multiple files can also be specified. Posting data from a file named 'foobar' would thus be done with -d, --data @foobar. When --data is told to read from a file like that, carriage returns and newlines will be stripped out. If you don't want the @ character to have a special interpretation use --data-raw instead.
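For illustration (the URL below is just a placeholder), those three forms look like this:
# Read the POST body from the file 'foobar'; curl strips carriage returns and newlines
curl -d @foobar 'https://example.com/api'
# Read the POST body from stdin
echo 'name=value' | curl -d @- 'https://example.com/api'
# Send a body that literally starts with '@', with no file interpretation
curl --data-raw '@foobar' 'https://example.com/api'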
Depending on your HTTP endpoint and server configuration, you should be fine using this format:
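A minimal sketch of such a request, assuming the JSON payload lives in a file named data.json and using a placeholder endpoint:
curl -X POST 'https://example.com/api' \
     -H 'Content-Type: application/json' \
     -d @data.json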
Sounds like you want to wrap the content of input in a JSON body, and then have that sent over with a POST request. I think that the simplest way to do that is to manipulate stdin first and then push that over to curl using -d @-. One way could look like this:
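A minimal sketch of such a pipeline, assuming an httpbin-style test endpoint and a wrapper field named "data":
# Type or paste your data, then press Ctrl + D;
# jq -Rs wraps everything read from stdin into a JSON object before curl posts it.
cat | jq -Rs '{"data": .}' \
    | curl -v 'https://httpbin.org/post' \
           -H 'Content-Type: application/json' \
           -d @-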
This drops you into cat, where you can type the data directly or paste it, e.g. with Shift + Insert in your terminal. You finish with a newline and Ctrl + D, which signals to cat that you are done. The data is then passed to curl, and you get a reusable history entry.
# Create the input file
echo -n 'Try 😁 and " to verify proper JSON encoding.' > file.txt
# 1. Use jq to read the file into a variable named `input`
# 2. Create the desired JSON
# 3. Pipe the result into curl
jq -n --rawfile input file.txt '{"title":"mytitle", $input}' \
| curl -v 'https://httpbin.org/post' -H 'Content-Type: application/json' -d@-
Output:
...
"json": {
"input": "Try \ud83d\ude01 and \" to verify proper JSON encoding.",
"title": "mytitle"
},
...
Notice that the contents of the input file were properly escaped for use as a JSON value.
jq options used:
--null-input / -n:
    Don't read any input; the filter is run once with null as its input.
--rawfile variable-name filename:
    Reads the named file and binds its contents, as a single string, to the given global variable.
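To see what --rawfile binds on its own, you can print just the variable; the file's raw text comes back as a single JSON string with the quote escaped:
jq -n --rawfile input file.txt '$input'
# "Try 😁 and \" to verify proper JSON encoding."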