Craft and publish engaging content in an app built for creators.
Publish anywhere
Post on LinkedIn, Threads, & Mastodon at the same time, in one click.
AI ideas and rewrites
Get suggestions, tweet ideas, and rewrites powered by AI.
Turn your tweets & threads into a social blog
Give your content new life with our beautiful, shareable pages. Make it go viral on other platforms too.
Powerful analytics to grow faster
Easily track your engagement analytics to improve your content and grow faster.
Build in public
Share a recent learning with your followers.
Create engagement
Pose a thought-provoking question.
Never run out of ideas
Get prompts and ideas whenever you write - with examples of popular tweets.
Share drafts & leave comments
Write with your teammates and get feedback with comments.
Create giveaways with Auto-DMs
Send DMs automatically based on engagement with your tweets.
And much more:
Auto-Split Text in Posts
Thread Finisher
Tweet Numbering
Pin Drafts
Connect Multiple Accounts
Automatic Backups
Dark Mode
Keyboard Shortcuts
Creators love Typefully
180,000+ creators and teams chose Typefully to curate their Twitter presence.
Marc Köhlbrugge @marckohlbrugge
Tweeting more with @typefully these days.
🙈 Distraction-free
✍️ Write-only Twitter
🧵 Effortless threads
📈 Actionable metrics
I recommend giving it a shot.
Jurre Houtkamp @jurrehoutkamp
Typefully is fantastic and way too cheap for what you get.
We’ve tried many alternatives at @framer but nothing beats it. If you’re still tweeting from Twitter, you’re wasting time.
DHH @dhh
This is my new go-to writing environment for Twitter threads.
They've built something wonderfully simple and distraction free with Typefully 😍
Santiago @svpino
For 24 months, I tried almost a dozen Twitter scheduling tools.
Then I found @typefully, and I've been using it for seven months straight.
When it comes down to the experience of scheduling and long-form content writing, Typefully is in a league of its own.
Luca Rossi ꩜ @lucaronin
After trying literally all the major Twitter scheduling tools, I settled with @typefully.
Killer feature to me is the native image editor — unique and super useful 🙏
Visual Theory @visualtheory_
Really impressed by the way @typefully has simplified my Twitter writing + scheduling/publishing experience.
Beautiful user experience.
0 friction.
Simplicity is the ultimate sophistication.
Queue your content in seconds
Write, schedule and boost your tweets - with no need for extra apps.
Schedule with one click
Queue your post with a single click - or pick a time manually.
Pick the perfect time
Time each post to perfection with Typefully's performance analytics.
Boost your content
Retweet and plug your posts for automated engagement.
Start creating a content queue.
Write once, publish everywhere
We natively support multiple platforms, so that you can expand your reach easily.
Check the analytics that matter
Build your audience with insights that make sense.
Writing prompts & personalized post ideas
Break through writer's block with great ideas and suggestions.
Never run out of ideas
Enjoy daily prompts and ideas to inspire your writing.
Use AI for personalized suggestions
Get inspiration from ideas based on your own past tweets.
Flick through topics
Or skim through curated collections of trending tweets for each topic.
Write, edit, and track tweets together
Write and publish with your teammates and friends.
Share your drafts
Brainstorm and bounce ideas with your teammates.
Add comments
Get feedback from coworkers before you hit publish.
Control user access
Decide who can view, edit, or publish your drafts.
The batch size is one of the most influential hyperparameters when training a neural network.
Here is everything you need to know about the batch size when training a neural network:
Gradient Descent is an optimization algorithm used to train neural networks.
On every iteration, the algorithm computes how much we need to adjust the model to get closer to the desired results.
Here is how it works in a couple of sentences:
We take samples from the training dataset, run them through the model, and determine how far away our results are from the ones we expect.
We use this "error" to compute how much we need to update the model weights to improve the results.
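Here is a minimal sketch of one iteration. It assumes a simple linear model and mean squared error, both picked purely for illustration:

```python
import numpy as np

# Illustrative setup: a linear model y = X @ w trained with mean squared error.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))          # 100 samples, 3 features
y = X @ np.array([1.5, -2.0, 0.5])     # targets from a "true" weight vector
w = np.zeros(3)                        # model weights we want to learn
lr = 0.1                               # learning rate

# One Gradient Descent iteration:
predictions = X @ w                    # run the samples through the model
error = predictions - y                # how far we are from the expected results
gradient = X.T @ error / len(X)        # how much each weight needs to change
w -= lr * gradient                     # update the model
```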
A critical decision we need to make is how many samples we use on every iteration.
We have three choices:
• Use a single sample of data
• Use all of the data at once
• Use some of the data
Using a single sample of data on every iteration is called "Stochastic Gradient Descent" (SGD).
The algorithm uses one sample at a time to compute the updates.
Advantages of Stochastic Gradient Descent:
• Simple to understand
• Avoids getting stuck in local minima
• Provides immediate feedback
Disadvantages:
• Computationally intensive
• May not settle in the global minimum
• The performance will be noisy
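Here is SGD sketched with the same linear-model setup as the snippet above (illustrative only):

```python
# Stochastic Gradient Descent: one sample per update.
for epoch in range(10):
    for i in rng.permutation(len(X)):  # visit the samples in random order
        error = X[i] @ w - y[i]        # error for a single sample
        w -= lr * error * X[i]         # immediate update from that one sample
```

The noise comes from the fact that every update is based on a single, possibly unrepresentative sample.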
Using all the data at once is called "Batch Gradient Descent."
The algorithm processes the entire dataset and computes a single update after seeing every sample.
Advantages of Batch Gradient Descent:
• Computationally efficient
• Stable performance (less noise)
Disadvantages:
• Requires a lot of memory
• May get stuck in local minima
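Sketched with the same setup as before, the whole dataset produces one update per pass (illustrative only):

```python
# Batch Gradient Descent: the entire dataset per update.
for epoch in range(100):
    error = X @ w - y                   # every sample at once (memory-hungry at scale)
    w -= lr * (X.T @ error) / len(X)    # one stable update per epoch
```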
Using some data (more than one sample but fewer than the entire dataset) is called "Mini-Batch Gradient Descent."
The algorithm works like Batch Gradient Descent, the only difference being that we use fewer samples.
Advantages of Mini-Batch Gradient Descent:
• Avoids getting stuck in local minima
• More computationally efficient than SGD
• Doesn't need as much memory as Batch Gradient Descent
Disadvantages:
• New hyperparameter to worry about
We usually call this hyperparameter "batch_size."
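Here is the same setup one more time, with batch_size as the new knob (illustrative only):

```python
# Mini-Batch Gradient Descent: batch_size samples per update.
batch_size = 32
for epoch in range(10):
    order = rng.permutation(len(X))     # shuffle once per epoch
    for start in range(0, len(X), batch_size):
        batch = order[start:start + batch_size]
        error = X[batch] @ w - y[batch]
        w -= lr * (X[batch].T @ error) / len(batch)
```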
We rarely use Batch Gradient Descent in practice, especially with large datasets.
Stochastic Gradient Descent (using a single sample at a time) is not that popular either.
Mini-Batch Gradient Descent is the most widely used.
There's a lot of research around the optimal batch size.
Every problem is different, but empirical evidence suggests that smaller batches perform better.
(Small as in fewer than a hundred or so.)
Here is a good paper talking about recommendations for the batch size:
arxiv.org/abs/1206.5533
They recommend using 32 as a good default value.
A few notes:
• Training time: 263s, 21s, and 8s, respectively.
• Testing accuracy: 0.62, 0.77, and 0.78, respectively.
• Attached images show the noise in the testing loss.
Noticeable differences!
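If you want to experiment yourself: in a framework like Keras, the batch size is a single argument to fit(). A minimal placeholder model, purely for illustration (it reuses the X and y arrays from the sketches above):

```python
import tensorflow as tf

# A tiny placeholder model, just to show where batch_size goes.
model = tf.keras.Sequential([tf.keras.Input(shape=(3,)),
                             tf.keras.layers.Dense(1)])
model.compile(optimizer="sgd", loss="mse")

# batch_size=32 is also what Keras uses by default if you omit the argument.
model.fit(X, y, epochs=10, batch_size=32)
```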
Every week, I break down machine learning concepts to give you ideas on applying them in real-life situations.
Follow me @svpino to ensure you don't miss what's coming next.