I wrote a small program to browse folders in the terminal. The main inspiration was type-ahead search in GUI file managers. There exist several programs that are similar (see the listing in the README), but none of them do it quite the way I like, and often they have a very complex UI and a ton of features. I tried to make something that is obvious how to use and gets out of your way. (I also wanted an excuse to learn Rust.)
That's mentioned in the readme! :) I'm pretty happy with the name choice: it's short, doesn't seem to clash with other CLI tools, it can be interpreted as "terminal explorer", and has a kind of cute meaning as you say.
Note that setting up tere requires defining an alias in your shell config anyway, so you can choose any name you want with basically zero extra effort! Except for coming up with an abbreviation you remember, of course. But I agree, it is faster to type with alternating hands.
Some Rust dependency bloat is pretty common, because a lot of idiomatic approaches are hidden behind dependencies. I don't think it's particularly crazy in this situation. The actual dependencies directly used are relatively lean:
crossterm = "0.24.0"
dirs = "4.0.0"
regex = "1.5.4"
serde_json = "1.0"
serde = { version = "1.0", features = ["rc"] }
textwrap = "0.14"
unicode-segmentation = "1.7"
I guess one could write their own textwrap and probably also dirs, but other than that they all are probably vital to this tool.
The only reason this wouldn't be a problem in another language is that the language already includes the functionality of such libraries.
I can easily look at each one of these and understand why it’s there, save only a few: most of crossterm’s recursive dependencies, which are fairly involved but the immediate dependencies at least are clearly explained in its README (and you could probably remove one or two of them at minor performance or similar costs); mio’s use of log, which I believe should be optional and I would say not enabled by default, but I’d rather use it with log than miss out on it, where it’s useful; uses of autocfg, which are build-time implementation detail for taking advantage of rustc features where available and readily understood on analysis; and I was going to say cfg-if, which I have a minor personal vendetta against due to extensive unnecessary use, but after looking at parking_lot_core’s src/thread_parker/mod.rs I will begrudgingly admit this is one of the rare cases where it is fairly well-justified. All up, none of the dependencies are in any way unreasonable.
Having over fifty dependencies for something like this is not in any way terrifying—you’re just starting with a different set of expectations and understandings of how things are done. It’s a natural and reasonable outworking of (a) using the right tools for the job, (b) doing things properly rather than half-heartedly (most obviously where terminal interactions are involved), and (c) Rust’s deliberately thin standard library.
atoi is a classic example of the sort of bad naming C has long been infamous for. It's not a word, it's an initialism, and the common documentation for it doesn't even say what the initialism actually means. It's relatively easy to infer what it means from the description of what it does (ASCII to integer), but it can just as easily be inferred incorrectly ([character] array to integer).
C kind of gets a pass for this stuff because C is old, and in the early days identifier length was severely limited. But to still be using this term today in a new language seems fairly worthy of derision.
In fairness, C standard library functions mostly look like magic incantations to me (I think due in large part to the old-school convention of abbreviating function names). malloc, sprintf, atoi, etc. don't exactly read like English... (unless you happen to already know what they stand for)
I don't know much about Rust, but I have a hard time believing Python's standard library wouldn't have all the required functionality covered (whether performance would be adequate is a different question). I, for one, really like Python for its comprehensive standard library.
Indeed, Python would be a good language to implement this in terms of ease of development, but it's very difficult to distribute a standalone binary (which I wanted to do). Python's built-in curses support is also not cross-platform, I think.
It’s a funny thing I’ve noticed about scripting languages: they’re generally easier to get going with yourself, but they’re horrible for distribution/deployment, and if you have to integrate code written in other languages (even C libraries with Python bindings, or similar—things like wxWidgets or GTK), that rapidly escalates to a nightmare. Meanwhile, ahead-of-time compiled languages like Rust and Go are simply a breeze to distribute/deploy.
That's true, although I think in the case of Go, it is a central design decision to make single-binaries easy so it's more of an exception to the rule.
I don't think it's an inherent feature of scripting languages that they are hard to distribute. I'm pretty sure it's possible to package up a tiny Lua interpreter (or e.g. QuickJS) and all necessary scripts into a standalone file.
There are several working options for packaging Python apps, some of which have existed for twenty years. Sure, it's a little more complicated than compiling a static executable, but it is hardly a "nightmare." It's basically writing a config file and adding another stanza to a Makefile or modern equivalent.
Reminds me of the idea regularly pushed here that you need a virtualenv even for thirty-line scripts. I read these kinds of takes here often and am a bit baffled by them. Maybe it's because developers have lost administrator skills over time that this feels like an insurmountable challenge?
That static executable will basically contain the whole python interpreter with a huge standard library. Maybe makes sense for a gui app, but I'd avoid installing a whole python interpreter for each of my little cli tools.
Don't forget the startup time overhead of first loading a whole interpreter into memory, then loading a python program into the interpreter.
There are multiple options for these requirements as well. I understand that solutions are sometimes clumsy, but the end-user won't know the difference.
I’ve worked with several of those ways of distributing Python apps over the years, mostly under Windows with a little under Linux and macOS, and mostly around a decade ago now, though I’ve touched a very little non-Windows stuff in the last year or two. They’ve consistently been more work than they seem even if all your stuff is pure Python, and there are just caveats left, right and centre. All kinds of stuff that should be fundamental and built-in was just extra effort at every step. Hooking up resources in the .exe file, controlling the Windows subsystem stuff, adding a manifest that works, figuring out which MSVCRT redistributables you need and how to hook them up (and you pretty much just couldn’t automate this in any way), finding what library files (.dll, .so, whatever) you need because of dynamic linking, fixing library.zip stuff that just mysteriously didn’t work for some libraries, special-casing stuff here and there for C dependencies, tweaking optimisation and compression levels to try to speed things up and shrink them since startup is unreasonably slow and the distribution unreasonably large but these tweaks are also inconsistent and often break things, adding files your program depends on in a place where they’ll work, tweaking your code because it runs differently under py2exe or py2app or pyinstaller or whatever, trying to figure out why what you’ve built just isn’t even starting on so-and-so’s machine…
Granted, a few of these things apply to Rust stuff as well (e.g. resources and manifests can’t quite be done out of the box without extra tools), but most of them are inapplicable, and the remainder tend to have better solutions than I observed in Python-land. And a lot of the pain that I’m describing of the Python stuff isn’t that it’s hard to do anything, but more that I’ve found it all just exceptionally error prone and unreliable.
Yes, Rust has a very different philosophy for the standard library. As soon as something enters the standard library its API is basically set in stone, since there's no versioning for the standard lib. As a consequence, Rust chooses to keep a lot of "must-have" functionality in crates, where it can evolve and compete (rather than just adding more and more HTTP clients to the standard lib like Python did :) ).
Python is "batteries included", Rust isn't. Different design choices entirely.
You can do a lot with Python's stdlib, and a lot of popular libraries (requests, for example) largely are just usability wrappers for underlying stdlib stuff.
Whereas with Rust, you have a very minimal std, and things you want to achieve are imported specifically.
No, actually. Of the direct dependencies, Python’s standard library has equivalents of regex (re), serde/serde_json (json), clap (argparse) and the boring half of textwrap (textwrap, which is only suitable for ASCII, and possibly curses providing a buggy and incomplete version of Unicode- and column-awareness, but I’m not sure of even that much, and I don’t think it’s properly cross-platform either), and some parts of crossterm (curses, of uncertain availability and reliability). But it completely lacks equivalents of most of crossterm and textwrap, and all of dirs and unicode-segmentation, providing in the latter two cases only functionality pretty much exactly equivalent to what Rust’s standard library contains.
This is the sort of thing that I meant by “doing things properly rather than half-heartedly”. You could make something like this by reimplementing half the libraries in an inferior fashion, full of bugs, cross-platform inconsistencies and missing functionality. Or you can take an extra dependency. Even in Python, the recommendation would be firmly to add dependencies like appdirs for dirs and I dunno what for the rest.
I tried to avoid bloat as much as possible, and I would argue that the non-transitive deps are pretty essential. I'll look into tweaking some features as suggested in a sibling comment to trim it down further.
But you're right that it could always be simpler, in fact I wrote tere originally in C with curses as the only dependency, and it compiles >10x faster. But there I had to manually write some (pretty certainly buggy) unicode handling, and I think adding extra features (proper arg parsing, json for history file etc) would be way more painful.
I wonder, have you considered D? Seems almost ideal to me for this kind of small tooling, and it's unlikely to break because of dependency version changes caused by the thirty-seventh party.
Not really. The main reason for choosing Rust was to try it out, because it's the new cool thing. And I do find it really enjoyable. Rust is also trying very hard to keep dependency-related breakage to a minimum, and I think the Cargo ecosystem is doing a great job at that.
Fair enough; "I wanted to try out X" is always a valid reason. It's just that compared to C, things like Unicode support, argument parsing and JSON support are already part of D's standard library so no dependencies should be required for those.
Rust is also unlikely to break because of dependency version changes caused by the thirty-seventh party. Despite similarly large dependency trees, the Rust ecosystem takes backwards compatibility much more seriously than e.g. the JavaScript ecosystem.
I mean sure, but it means you have to maintain that functionality yourself. Often DIY implementations are not as robust as the versions in battle tested libraries with lots of eyes. So you can easily end up with more problems overall.
> unlikely to break because of dependency version changes caused by the thirty-seventh party.
With rust you just check in your Cargo.lock file to your VCS and then the versions of your dependencies and their dependencies (etc) are pinned, so if it works now then it will always work. For dependencies on crates.io authors don’t even have the option to remove a version once it’s published.
Upgrading dependencies is an explicit operation, so you only do it when required and run your tests afterwards.
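For example (serde here is just an illustrative crate name):

# Check the lockfile in so builds are reproducible for everyone
git add Cargo.toml Cargo.lock

# Upgrading is an explicit, deliberate step
cargo update            # bump everything within semver-compatible ranges
cargo update -p serde   # or bump just one crate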
In practice I’ve had far fewer issues with dependency versions using rust than I’ve had using dynamic system libraries in the C/C++ ecosystem
I disagree with the "bloat" comments. First, the binary will only have stuff you use. Secondly, the fact that you are not writing from scratch code that someone else had more time to write and test - this is a huge plus. I'd use your implementation any day over something that someone could put together in C with no dependencies and remarkable pointer-math acrobatics.
As long as the use of dependencies remains reasonable, the number of dependencies does not immediately mean that the code is "bloated".
The "too many dependencies" meme is far too often an example of Chesterton's Fence.
The answer to your question "How come a simple thing is more complex than I first thought?" is: "Because you've spent less time thinking about it than the author".
This isn't true, though. If this had been an older GNU utility, the dependencies likely would have been libterm and libc. Not because it would have been less complex, but because of the ecosystem choice in C applications to have larger, fully-featured libraries versus the micro-libraries Rust seems to have inherited from JavaScript conventions.
Whether this is better or worse seems to be a matter of contention for a lot of people, but it certainly poses a challenge for supply-chain auditing. I suspect for Rust to truly ever replace C++ in its domain, the ecosystem is going to need to come up with something like boost so you can grab one library that does all the things nearly any program wants (regex, serialization, cli opts, etc) but that don't get included in the core language's stdlib.
Musl is liked since it has a non-copyleft license and thus can be used to produce statically linked executables that are distributed with proprietary licenses. Also, Musl is smaller than GLibc and thus more suitable for embedded systems.
Anyway, GLibc is just better, both in terms of performance (algorithms are more optimized, at the cost of a bigger executable size) and in terms of available features.
> Musl is liked since it has a non-copyleft license
Not only for that. As much as I love copyleft, I'm really happy that musl exists so that I can easily test my programs locally for portability. Compiling stuff with different compilers and libraries exposes all sorts of weird bugs at once, before you get to publish your code.
I don't think you can cite Chesterton's Fence for something that is 5 minutes old and lightly used. The Chesterton's Fence meme is far too often an excuse for baselessly dismissing criticism.
Well, that's Rust for you. They somehow think that having a boatload of dependencies with nondescript names (smawk, parking_lot_core, serde, clap - I feel like I'm having a stroke) is a feature. I'm not even joking.
You're free to create your own project with your own name and have it compete on the free market for dominance.
I don't get this criticism at all. Sure, it could be simpler, but (a) it's not your project, so who are you to say how to name it, and (b) Rust developers clearly understand what these crates are used for, considering they're popular enough to be brought up here.
And unless you're actively developing in Rust, why would some other people's choice of naming their projects in a language and tool chain you don't ever plan on using be such a big deal?
You can say this about literally any criticism ever. I hate it and it blows chunks. It is the ultimate libertarian way of striking down any discussion: do not criticise, even in any small way, because it's a free market and you can just make another one, so never EVER even THINK of discussing anything.
I'm not saying that there should be some kind of totalitarian regime that enforces naming, I'm simply saying that yeah it kinda sucks that Rust devs tend towards cutesy names over ones that convey functionality.
This just seems like you're getting weirdly offended for no reason. Not everything is some kind of attack on individual agency. There is absolutely nothing wrong with me remarking on the state of Rust naming. The fact that you are conflating it into a "big deal", which I never said it was, just seems like bad-faith arguing to me. This is just not something to get so offended over.
> I'm simply saying that yeah it kinda sucks that Rust devs tend towards cutesy names over ones that convey functionality.
Are you using Rust? If not, why does it matter what they choose to name their libraries? This is my entire point: you're coming in as an outsider criticizing something that is clearly not a problem to Rust devs.
If actual Rust developers are using them to the point where they're popular enough to gain (unwarranted, imo) criticism, then clearly they're working, silly name and all.
It's almost as if people care more about the function a package provides than about its name.
I'm not an outsider, I have used Rust before, though not extensively; not that this matters at all, since an active Rust developer has no more permission to criticise something about the Rust ecosystem than someone new to Rust.
But it does matter. If the people actively using the packages aren't complaining, then there isn't really a problem.
You've hyperfocused so much on my "criticism of your criticism" and the permissibility behind it that you chose to ignore the actual criticism I laid down.
I'm not continuing this either way since I've said my piece. Maybe step back and actually read my comments rather than focusing on me criticizing your critique.
Also, pot calling the kettle black. You spent an entire comment being offended over my critique.
You probably shouldn't make such inflammatory comments on an account that is directly linked to your professional and personal profiles, it really doesn't look great
Also, it isn't a criticism of my criticism, it is an assertion that I should not be allowed to criticise at all, which is a totally different thing. Of course I am going to call that out.
EDIT: I can't reply because rate limited
Please calm down. I'm not threatening you. I'm simply surprised that you make such inflammatory and aggressive comments on an account that is so closely tied to your professional life, that is all. I think you should reconsider doing so
Oh so you jump to implicitly threatening those you disagree with? Wow. At least you made it clear how open you are to discussion and debate.
I never said you can't criticize. As the other commenter already said, you're allowed to critique but don't be surprised when your criticism is then critiqued itself. That's the epitome of modern discussions and debate.
I gave you examples of what you can do right now, aka making your own alternative packages, but you chose to ignore that all and double down on "don't you dare critique my criticism".
And then (I'm going to point this out again because it's so absurd) vaguely implied my comments would face professional retribution. Because I chose to disagree with you and make that public.
In any case, I said I'm stopping and I am. You should too, as you're already crossing several lines with this comment.
Why the hell does one have to use Rust to comment on a library name? Is there some license requirement for comments on Rust stuff? It's not like people go on GitHub and tell authors that their project names are silly and need to be changed.
This is exactly the cultish thing about Rust: unless someone is praising Rust, they should keep their mouth shut.
That's not really what's going on here though, because it isn't counter criticism, or a response to the criticism, but is instead simply "you shouldn't get to criticise"
You're saying it's "a common term", but I'd literally never heard it before a Rust library with this exact name appeared. Isn't this by any chance circular reasoning? That it became "a common term" because of this library? In that case you couldn't justify the name of the library by saying that it was "a common term" at that time.
My $0.02 - I've heard of it and I've never worked in Rust, though calling it "a common term" seems like a bit of a stretch - I was over 5 years into my career before I encountered it. It's much less common than analogous terms like modem ("modulator-demodulator").
Approximately everything, but on its own nothing. Serde provides the framework and macros/annotations, serde_json is the "backend" implementing the actual json serialization.
Sure, it could be named serde_framework and serde_json, but I think serde and serde_json isn't that bad either.
Snottily states that the Rust community loves having too many dependencies and that the cutesy names are stroke-inducing. And you think the replies to that are aggressive?
The comment would probably do better if it was phrased more like “Rust projects usually have lots of dependencies, and a lot of them have unhelpful names, and this is bad because XYZ”. This comment isn’t trying to offer constructive criticism or add to the discussion, it is just mocking rust and its users. The tone makes me a lot more likely to downvote it and move on rather than engaging in discussion.
For context, I totally agree that rust often ends up with more dependencies than I would like and that crate names are often super unhelpful, but I downvoted the comment.
They cannot be removed without good reason (like “here’s a letter from my lawyer” or “this code secretly exfiltrates credentials”), and a lock file means that even if they change, your usage of them won’t until you ask to upgrade versions.
The deps for each of these packages are really out of the author's hands; this is the same in any programming language where packages pull in dependencies of their own, though.
That's true, but OP did ask a question "How come...". Your answer is that the Rust ecosystem is to blame for encouraging micro-dependencies, like Node.js, and not the author. I think that's a pretty reasonable take! But it doesn't mean that there's not a problem.
It's true that vim probably contains all of the functionality of tere, but IME browsing with vim is still a bit more cumbersome (and I'm not sure if you can configure it to print the cwd on exit, though probably you can). The extra '/' keystroke in each subdirectory for type to search, and the auto-cd really does make tere feel smoother.
Also netrw is a bit of a mess; I once found a bug and thought "I'll write a patch". After five minutes of browsing through the code ([insert Eddie Murphy meme]): "yeah, never mind".
Yeah, I'll admit that the basic idea is not novel (I have a list of similar programs in the README). But I bet your tool doesn't work exactly the way I like! :)
I would say that terminal explorer is what I had in mind initially (and incidentally, I also speak Finnish, heh). But a misspelling of tree is certainly a valid interpretation as well :)
Tab completion with these settings takes some extra keystrokes: if there's only one match, you have to press tab twice to see what's in the subfolder. If you have several matches, you have to press tab, then enter to select the one you want, and then tab again to show the subfolder contents. And going up (especially several) folders is also quite a bit of typing. It might sound like splitting hairs to shave off less than five keystrokes, but it really does make a difference in how smooth the browsing feels.
In the demo, how did you get the key press indicator to show? I haven't seen that before, but it's a wonderful addition. And may I ask what you used to record it?
Echoing neighbor comments that the README is very well done. I specifically searched through this thread to learn how you displayed those keystrokes in the demo.
I was thinking the same. Building something without sharing is one thing. Dumping code on GitHub is second level. Third is building a good README. Fourth is building a good README and announcing it to the right audience.
Very well executed.
I'm not sure there's a fifth level short of "being obnoxiously pushy," but if there is such a thing, I'd be interested to know. :)
This feels a lot like Vim's built in file explorer (netrw).
I find these kinds of text-based, keyboard-centric explorers to be far superior for navigating around a codebase, then giving you back your screen space as soon as you've found what you're looking for.
I hit ~ expecting to go to my home directory, which was a bit odd
Once I select a directory I want to dump myself back out into a command line in that directory, I couldn't see the key for that? Or am I misunderstanding the purpose?
To end up in the directory where you are, you just have to exit using esc or alt-q, assuming you have set up your shell config.
I'll add ~ as a shortcut for the home directory, thanks for the good idea! (It conflicts with type-to-search for folders starting with '~', but I think that's much less common than cding to home.)
I strongly dislike the idea of automatically jumping into a folder when there’s only one match: it feels like a feature optimised for very slow typers that actively penalises fast typers, because it makes things generally unpredictable: will one character be enough to navigate, or two, or three? It depends on the siblings, and if you type more characters than are required, you will be penalised by ending up somewhere else as the remaining letters are fed to the next level down that you haven’t even seen yet. My general habit in navigating is to type three characters and press Tab, because that’s almost always sufficient, and is plenty fast. I have a very few places where I’ve learned to do other things. Under my habit, I could end up as much as two levels down from where I wanted to be.
I'm a fast typer, and I understand where you're coming from! But note how there's a small (200ms by default but can be configured) delay when the auto-cd happens. This is actually pretty important, because otherwise it's impossible to see where you cd'd to. Importantly, during this delay, the keystrokes are eaten (so they are not fed down to the next level), and from my experience this is enough to basically never accidentally cd in the lower level.
The auto-cd can also be turned off with a CLI option.
I think you were highlighting an important potential usability issue; the delay might be ok, but I still prefer a keystroke, such as enter, to confirm selection.
Eh, I get what avgcorrection was going for—unintentionally, incidentally. Depending on the angle you look at it, it could count as both direct and indirect. I’ll stand by “actively”.
I have the "automatically jump to a folder" setting turned on in my editor's file browser (Emacs with Helm), and it works much better than you're predicting - I rarely end up more folders down than I am shooting for, I don't need to invest much mental energy into prediction, and if I make a mistake, navigating up again is a single chord.
I think you're underestimating how efficient your brain can get at guesstimating how many characters are required to navigate to a directory, especially when the directory structure is familiar (at which point I virtually never overshoot).
And, I still find value in this feature despite being a fast typist (90 WPM).
Is cd+ls a common pattern for people? I tend to just use double tab to browse folders, combined with a file separator-aware "delete word" keybinding when I want to go back up one level.
Same here, I use ls when I'm where I want to be, but for navigating the filesystem to there, I use cd and TAB TAB autocompletion (which is smart enough to only suggest directories).
I've got ZSH or a ZSH plugin that shows me a list of possible files in a folder I'm cd-ing into, without going into a different 'terminal mode'. Works for things like vim as well.
This is the best reason ever, and not just for you: seeing your GIF I thought "could I achieve that with fzf?". Turns out, yes I can:
function fcd() {
    local dir;
    while true; do
        # exit with ^D
        dir="$(ls -a1p | grep '/$' | grep -v '^./$' | fzf --height 40% --reverse --no-multi --preview 'pwd' --preview-window=up,1,border-none --no-info)"
        if [[ -z "${dir}" ]]; then
            break
        else
            cd "${dir}"
        fi
    done
}
Certainly not the best code (and definitely not a jab at your implementation!) but, hey, it was purely for the heck of it.
Lately I've been trying to find the joy in programming again, these kind of fun little challenges help a lot, so thanks for sharing, enthusiasm and creativity is contaminating :)
While digging up alternatives (of course after I had already written most of the functionality), I briefly tried out fzf. I think at that time I couldn't find an example snippet like yours to do the cding, so I didn't look into it much more. With some basic settings, it was also not easy (or even possible?) to go up in the folder tree, but I see that your example handles that.
If you're unfamiliar with a big folder tree, fzf (or another very similar tool that is designed for this purpose, broot) can be more efficient. But it might take a while for it to scan all subfolders.
> fzf (or another very similar tool that is designed for this purpose, broot) can be more efficient
Well, fzf is only a selection tool, it's more about the tool that feeds fzf data ;) That's why I used a dumb ls in my example. Maybe a difference with yours is that it does perform cd on each loop, and there's no way to bail out and not cd.
> it might take a while for it to scan all subfolders.
Haha yeah, for recursive cases I was using ripgrep as fzf default command:
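Something along these lines, reconstructing from memory (exact flags may have differed):

# Have fzf use ripgrep's file walker instead of the default find
export FZF_DEFAULT_COMMAND='rg --files --hidden --glob "!.git"'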
Nice! exa (an attempt at a better ls) has a -D option, so then it is `exa -a1D` without the grep. I created a second version that also adds -R for searching the recursive directory tree with `exa -a1DR | grep : | cut -d ':' -f 1`
alias pf="fzf --preview='less {}' --bind shift-up:preview-page-up,shift-down:preview-page-down"
You can run `pf` (preview file) in a directory and it opens a split window with fzf where you can preview text files with less and optionally filter down which files are matched with fzf.
If you're using the fzf.vim plugin[0] you can run `:Commits` to do something similar to your command too. It adds a bit more detail such as when the commit was made in relative format and color codes the git diff.
FYI: I think you meant "contagious" rather than "contaminating". The latter means to make something impure whereas the former means more to spread something. :)
This is great, thank you! Just a couple of minor changes to handle softlinks to folders...
function fcd() {
    local dir;
    while true; do
        # exit with ^D
        dir="$(ls -a1F | grep '[/@]$' | grep -v '^./$' | sed 's/@$//' | fzf --height 40% --reverse --no-multi --preview 'pwd' --preview-window=up,1,border-none --no-info)"
        if [[ -z "${dir}" ]]; then
            break
        elif [[ -d "${dir}" ]]; then
            cd "${dir}"
        fi
    done
}
# For fuzzy-jumping down your home directory.
# See http://richardmavis.info/fuzzy-jumping
function fcd {
    cache=~/.config/home-dirs-list
    if (( $# == 0 )); then
        if [ -e $cache ]; then
            cd "`fzf --height=10 < ${cache}`"
        else
            echo "No directory cache."
            cd `find ~ -type d | fzf --height=10`
        fi
    elif (( $# == 1 )); then
        if [[ $1 == "--cache" ]]; then
            echo "Making home directory cache."
            find ~ -type d | sort > $cache
        elif [ -e $cache ]; then
            cd "`cat ${cache} | grep $1 | fzf --height=10`"
        else
            echo "No directory cache."
            cd `find $1 -type d | fzf --height=10`
        fi
    else
        echo "Usage: fcd [DIR]|--cache"
    fi
}
Nice, looks like this is indeed very similar to tere! I guess the biggest difference is that the current search is not as "sticky", looks like it's cleared automatically after a while, which is similar to the behavior of the windows file explorer. I added it to the list of similar programs in the README (it's in the develop branch for now).
I was well prepared mentally for the HN cynicism, and I did spend quite a bit of mental energy to justify the existence of tere to myself. Luckily I have a good rebuttal: I already use vim! :)
Sorry for being dense but I thought when I entered into a directory and pressed Esc it would pwd into that directory. But in my case, entering the dir and pressing esc just prints the dir name and doesn't pwd into it. I did the cargo install.. What am I missing?
You need to configure a shell function to actually do the cd (see the snippets in the README). It has to be done this way because a subprocess cannot change the pwd of the parent.
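For bash, the wrapper is roughly of this shape (simplified; see the README for the exact, up-to-date snippet):

# Run tere, capture the directory it prints on exit, and cd there.
tere() {
    local result
    result=$(command tere "$@")
    [ -n "$result" ] && cd -- "$result"
}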
Appreciate this is a hobby project to learn Rust in, but for those who want similar functionality without installing additional dependencies, you can already do this in `vim` by selecting a directory to open instead of a file.
For example
vim ~
vim /etc
It has more or less the same UI as the one in this Show HN but with the added feature of opening any files you select.
As mentioned in another comment, vim doesn't provide exactly the same experience: navigation is more cumbersome because you need extra keystrokes for type-to-search and to cd a folder. It also requires some extra tinkering for vim to print out the cwd when you exit.
Clearly it's not going to be functionally identical, which is why I said it might interest people who are after something similar "without installing additional dependencies".
That said, I do think your reply here is a little disingenuous, because one extra keystroke is hardly "cumbersome". Especially when you advertise your tool as having VIM bindings and then criticise VIM for the same thing. I have no issues with you promoting your tool, but let's be pragmatic about our comparisons here.
You also overlooked the fact that VIM will open any documents from within it, which would save you a lot more keystrokes than tere due to not having to `$EDITOR $file` after the whole (tere|vim) {navigate} process.
As for automatically changing directory: let's be clear, `tere` doesn't do that either. You need to configure your shell to do it. Granted, you provide the code to set that up, but there's nothing stopping someone from writing a similar script for VIM (there's a VimLeave event so you could do something like
:autocmd VimLeave *
and the path would be stored in
expand('%:p:h')
so it's certainly doable to replicate the same behaviour in VIM. In fact I might even knock up a working script if anyone wants it?
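Something in this direction would probably cover most of it (untested sketch; the temp-file handling and the netrw edge cases would need checking):

# Untested: open vim, record the last buffer's directory on exit via
# a VimLeave autocmd, then cd to it from the shell wrapper.
vcd() {
    local tmp dest
    tmp=$(mktemp)
    vim -c "autocmd VimLeave * call writefile([expand('%:p:h')], '$tmp')" "${1:-.}"
    dest=$(cat "$tmp"); rm -f "$tmp"
    [ -d "$dest" ] && cd -- "$dest"
}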
I'm honestly not saying any of this to be negative about your project here though. It's a nice little tool and I'm sure some people will find it very useful. It's just a common problem, so you'll probably find a lot of people have already solved this with other tools. But there's absolutely nothing wrong with having another tool out there :)
To me, there really is a significant difference in friction when navigating in vim vs tere. Maybe it's because I need to use shift to type '/' on my keyboard layout. But I also have to press enter twice to cd after searching.
The point about opening the files within vim is valid, and I have been considering adding the option to call xdg-open on highlighted files, like many such tools do. I haven't decided yet if that's within the scope of tere.
> It's just a common problem so you'll probably find a lot of people have already solved this with other tools
I agree, and indeed there are a lot of existing alternatives as mentioned in the README. The main motivation for me was to make something that works just the way I want, and secondarily, to write something fun in Rust.
xplr author here (xplr.dev). Nice tool. Note that there's a plugin https://github.com/sayanarijit/type-to-nav.xplr that does something similar. But I just realized that tere can also be used as a more generic type-to-nav helper for xplr.
Well, what I think... It looks cool, but fish already includes similar functionality in its directory completion, so no external program required. So I suspect this is useful for people using bash and such?
One thing I'd like is a cd command incantation that when given a file path, it just changes to the dir containing such file.
It is such a common need (for me at least) that I'm surprised the standard cd doesn't do it already.
I find myself using find or locate, then wanting to change to some file's directory, thus needing to manually delete the file's name before hitting enter. When you do that enough times you start to wish it worked automatically :-)
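A tiny wrapper along these lines would do it (the name cdf is made up for illustration):

# cd to the directory containing a file; fall back to normal cd otherwise
cdf() {
    if [ -f "$1" ]; then
        cd -- "$(dirname -- "$1")"
    else
        cd -- "$@"
    fi
}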
This reminds me of "ncdu" which has the main task to let you know where your disk space has gone, but you can also use it to navigate folders and even delete stuff.
ccd is more concerned with an edge-first search, and since it's Windows, actually changing the directory involves a thread injection into the cmd process to do the work ... which somehow works.
OTOH, people have named a number of similar tools, and I'm going to point out 'mc'
aka Midnight Commander, which is part of most Linux distros, and is a clone of the 1980s Norton Commander. It's designed to be a command line TUI helper, sorta like the project here, with not only cursor/etc based directory navigation but copy/rename/delete/etc functionality just a keystroke away. It's fairly convenient because it overlays the command line rather than replacing it.
Do people actually still use cd + ls? At least in bash, you get a similar experience just typing a cd command and tab-completing the path elements - bash shows a listing of matches when your command is ambiguous, no `ls` required. I'll still check it out as I can see fuzzy-search being more useful, but muscle memory is hard to change.
One of my pet peeves is CLI newbies who always have the need to cd into some directory before running a command. I learnt pretty early on that this was stupid.
Just run your command from the repo root or $HOME.
To be fair, most of those newbies are probably writing scripts that use relative directories, so they require being run from that directory. Newbies tend to reinforce bad habits like that, which makes it harder to break out of those ruts.
IMO it's easier just to write your scripts so they don't rely on the directory context for running correctly. Usually it's just a matter of taking your relative-path-using script and sticking something like this at the top, and you're good.
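For a bash script that's typically the usual dirname/BASH_SOURCE incantation:

# Resolve the directory the script lives in, so its relative paths
# work no matter where it is invoked from.
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
cd "$SCRIPT_DIR"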
imho that is not a bad habit to have, what's the harm? you don't always know how the script/application behaves. sometimes they do things in relation to where you are standing.
If you mean the newbies cd into a location in $PATH though, e.g. cd /bin, cd /usr/bin and so on, then I agree, that doesn't usually make sense.
This is quite neat so I feel slightly embarrassed to share my low-tech solution. Just run a "dir" after each "cd" (which is my alias for "ls -l --si ...").
It's written for fish, but could be ported to bash easily:
# cat ~/.config/fish/functions/cd.fish
function cd
    builtin cd $argv;
    if test $status -gt 0 # there was an error, stop
        return
    end
    # auto print dir info
    if test "$argv" != "" # not home though
        dir
    end
end
There's probably a way to optimize out the "test " call as well, but I haven't got around to it.
Heh, I implemented almost exactly this same thing several years ago in Python[1], with the main difference just being the amount of polish and feature set. (I was, and continue to be, a very distractible person)