+
+ This program is free software: you can redistribute it and/or modify
+ it under the terms of the GNU Affero General Public License as published
+ by the Free Software Foundation, either version 3 of the License, or
+ (at your option) any later version.
+
+ This program is distributed in the hope that it will be useful,
+ but WITHOUT ANY WARRANTY; without even the implied warranty of
+ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+ GNU Affero General Public License for more details.
+
+ You should have received a copy of the GNU Affero General Public License
+ along with this program. If not, see <https://www.gnu.org/licenses/>.
+
+Also add information on how to contact you by electronic and paper mail.
+
+ If your software can interact with users remotely through a computer
+network, you should also make sure that it provides a way for users to
+get its source. For example, if your program is a web application, its
+interface could display a "Source" link that leads users to an archive
+of the code. There are many ways you could offer source, and different
+solutions will be better for different programs; see section 13 for the
+specific requirements.
+
+ You should also get your employer (if you work as a programmer) or school,
+if any, to sign a "copyright disclaimer" for the program, if necessary.
+For more information on this, and how to apply and follow the GNU AGPL, see
+<https://www.gnu.org/licenses/>.
\ No newline at end of file
diff --git a/README.md b/README.md
index e0df49c..a50212c 100644
--- a/README.md
+++ b/README.md
@@ -2,14 +2,15 @@
## Completed Tasks
-- [x] Text results
-- [x] Image results
+- [X] Text results
+- [X] Image results
- [X] Video results
- [X] Map results
- [X] Forums results
-- [x] HTML+CSS site (no JS version)
+- [X] HTML+CSS site (no JS version)
- [X] Results cache
- [X] Torrent results
+- [X] Better name
## Pending Tasks
@@ -17,31 +18,40 @@
- [ ] JS applets for results (such as calculator)
- [ ] Dynamic results loading as user scrolls
- [ ] Replace fonts, replace icon font with SVG or remove unnecessary icons for faster loading
-- [ ] Better name
- [ ] LXC container
- [ ] Docker container
- [ ] Automatic updates
- [ ] Scalable crawlers and webservers + load balancing
+- [ ] LibreY-like API
+- [ ] Music results
-# Ocásek (Warp) Search Engine
+# Warp (Ocásek) Search Engine
-A self-hosted private and anonymous [metasearch engine](https://en.wikipedia.org/wiki/Metasearch_engine), that aims to be more resource effichent and scalable. Decentralized services are nice, but juming between instances when one just stops working for some reason is just inconvenient. So thats why this engine can do both, you can self-hoste it or use [officiall instance](https://search.spitfirebrowser.com/).
+A self-hosted, private and anonymous [metasearch engine](https://en.wikipedia.org/wiki/Metasearch_engine) that aims to be more resource-efficient and scalable than its competition. It is built to scale because switching instances whenever one stops working or is under too much load is inconvenient. So you can comfortably use the [official instance](https://search.spitfirebrowser.com/), or self-host your own instance.
## Comparison to other search engines
-| Name | Works without JS | Privacy frontend redirect | Torrent results | API | No 3rd party libs | Scalable | Not Resource Hungry | Dynamic Page Loading |
-|------------|----------------------|---------------------------|-----------------|-----|-------------------|----------|---------------------------------------------|----------------------|
-| Whoogle | ✅ | ❓ Only host can set it | ❌ | ❌ | ❌ | ❌ | ❓ Moderate | ❓ Not specified |
-| Araa-Search| ✅ | ✅ | ✅ | ✅ | ❓ | ❌ | ❌ Very resource hungry | ❌ |
-| LibreY | ✅ | ✅ | ✅ | ✅ | ✅ | ❌ | ❓ Moderate | ❌ |
-| Ocásek | ✅ | ✅ | ✅ | ❌ | ✅ [1] | ✅ | ✅ about 20MiB at idle, 21MiB when searching| ✅ |
-[1]: It does not rely on 3rd-party libs for webscraping like [Selenium](https://www.javatpoint.com/selenium-webdriver), but it uses other search instalces like LibreX as fallback.
+| Feature | Whoogle | Araa-Search | LibreY | 4get | *Warp* |
+| :----------------------------------- | ------------------ | ------------------------- | ------------------------ | ------------------------ | ---------------------------------------------------- |
+| Works without JavaScript | ✅ | ✅ | ✅ | ✅ | ✅ |
+| Music search | ❓ | ❌ | ❌ | ✅ | ✅ |
+| Torrent search | ❌ | ✅ | ✅ | ❌ | ✅ |
+| API | ❌ | ✅ | ✅ | ✅ | ✅ |
+| Scalable | ❌ | ❌ | ❌ | ❌ | ✅ |
+| Not Resource Hungry | ❓ Moderate | ❌ Very resource hungry | ❌ Moderate, ~200-400 MiB | ❌ Moderate, ~200-400 MiB | ✅ About 15-20 MiB at idle, 17-22 MiB when searching |
+| Dynamic Page Loading | ❓ Not specified | ❌ | ❌ | ❌ | ✅ |
+| User themable | ❌ | ✅ | ❌ | ❌ | ✅ |
+| It has a dildo as its logo, unironically | ❌ | ❌ | ❌ | ✅ | ❌ |
+
+[1]: I was unable to verify this since their site does not work; the same goes for the community instances.
+
+[2]: The project repo states that it has an API, but it looks like it is no longer supported, or the "API" button and documentation were simply removed, since I was unable to find them anymore.
## Features
-- Text search using Google, Brave, DuckDuckGo and LibreX/Y search results.
-- Image search using the Qwant/Imgur.
+- Text search using Google, Brave, DuckDuckGo and LibreX/Y.
+- Image search using Qwant, Bing and Imgur.
- Video search using Piped API.
- Image viewing using proxy and direct links to image source pages for image searches.
- Maps using OpenStreetMap
@@ -64,4 +74,8 @@ chmod +x ./run.sh
./run.sh
```
-*Its that easy!*
\ No newline at end of file
+*It's that easy!*
+
+## License
+
+[![](https://www.gnu.org/graphics/agplv3-with-text-162x68.png)](https://www.gnu.org/licenses/agpl-3.0.html)
\ No newline at end of file
diff --git a/agent.go b/agent.go
old mode 100644
new mode 100755
index 172b209..296b4e4
--- a/agent.go
+++ b/agent.go
@@ -1,355 +1,355 @@
-package main
-
-import (
- "encoding/json"
- "fmt"
- "io/ioutil"
- "math/rand"
- "net/http"
- "sort"
- "sync"
- "time"
-)
-
-type BrowserVersion struct {
- Version string `json:"version"`
- Global float64 `json:"global"`
-}
-
-type BrowserData struct {
- Firefox []BrowserVersion `json:"firefox"`
- Chromium []BrowserVersion `json:"chrome"`
-}
-
-var (
- cache = struct {
- sync.RWMutex
- data map[string]string
- }{
- data: make(map[string]string),
- }
- browserCache = struct {
- sync.RWMutex
- data BrowserData
- expires time.Time
- }{
- expires: time.Now(),
- }
-)
-
-func fetchLatestBrowserVersions() (BrowserData, error) {
- url := "https://raw.githubusercontent.com/Fyrd/caniuse/master/fulldata-json/data-2.0.json"
-
- resp, err := http.Get(url)
- if err != nil {
- return BrowserData{}, err
- }
- defer resp.Body.Close()
-
- body, err := ioutil.ReadAll(resp.Body)
- if err != nil {
- return BrowserData{}, err
- }
-
- var rawData map[string]interface{}
- if err := json.Unmarshal(body, &rawData); err != nil {
- return BrowserData{}, err
- }
-
- stats := rawData["agents"].(map[string]interface{})
-
- var data BrowserData
-
- if firefoxData, ok := stats["firefox"].(map[string]interface{}); ok {
- for version, usage := range firefoxData["usage_global"].(map[string]interface{}) {
- data.Firefox = append(data.Firefox, BrowserVersion{
- Version: version,
- Global: usage.(float64),
- })
- }
- }
-
- if chromeData, ok := stats["chrome"].(map[string]interface{}); ok {
- for version, usage := range chromeData["usage_global"].(map[string]interface{}) {
- data.Chromium = append(data.Chromium, BrowserVersion{
- Version: version,
- Global: usage.(float64),
- })
- }
- }
-
- return data, nil
-}
-
-func getLatestBrowserVersions() (BrowserData, error) {
- browserCache.RLock()
- if time.Now().Before(browserCache.expires) {
- data := browserCache.data
- browserCache.RUnlock()
- return data, nil
- }
- browserCache.RUnlock()
-
- data, err := fetchLatestBrowserVersions()
- if err != nil {
- return BrowserData{}, err
- }
-
- browserCache.Lock()
- browserCache.data = data
- browserCache.expires = time.Now().Add(24 * time.Hour)
- browserCache.Unlock()
-
- return data, nil
-}
-
-func randomUserAgent() (string, error) {
- browsers, err := getLatestBrowserVersions()
- if err != nil {
- return "", err
- }
-
- rand.Seed(time.Now().UnixNano())
-
- // Simulated browser usage statistics (in percentages)
- usageStats := map[string]float64{
- "Firefox": 30.0,
- "Chromium": 70.0,
- }
-
- // Calculate the probabilities for the versions
- probabilities := []float64{0.5, 0.25, 0.125, 0.0625, 0.03125, 0.015625, 0.0078125, 0.00390625}
-
- // Select a browser based on usage statistics
- browserType := ""
- randVal := rand.Float64() * 100
- cumulative := 0.0
- for browser, usage := range usageStats {
- cumulative += usage
- if randVal < cumulative {
- browserType = browser
- break
- }
- }
-
- var versions []BrowserVersion
- switch browserType {
- case "Firefox":
- versions = browsers.Firefox
- case "Chromium":
- versions = browsers.Chromium
- }
-
- if len(versions) == 0 {
- return "", fmt.Errorf("no versions found for browser: %s", browserType)
- }
-
- // Sort versions by usage (descending order)
- sort.Slice(versions, func(i, j int) bool {
- return versions[i].Global > versions[j].Global
- })
-
- // Select a version based on the probabilities
- version := ""
- randVal = rand.Float64()
- cumulative = 0.0
- for i, p := range probabilities {
- cumulative += p
- if randVal < cumulative && i < len(versions) {
- version = versions[i].Version
- break
- }
- }
-
- if version == "" {
- version = versions[len(versions)-1].Version
- }
-
- // Generate the user agent string
- userAgent := generateUserAgent(browserType, version)
- return userAgent, nil
-}
-
-func generateUserAgent(browser, version string) string {
- oses := []struct {
- os string
- probability float64
- }{
- {"Windows NT 10.0; Win64; x64", 44.0},
- {"Windows NT 11.0; Win64; x64", 44.0},
- {"X11; Linux x86_64", 1.0},
- {"X11; Ubuntu; Linux x86_64", 1.0},
- {"Macintosh; Intel Mac OS X 10_15_7", 10.0},
- }
-
- // Select an OS based on probabilities
- randVal := rand.Float64() * 100
- cumulative := 0.0
- selectedOS := ""
- for _, os := range oses {
- cumulative += os.probability
- if randVal < cumulative {
- selectedOS = os.os
- break
- }
- }
-
- switch browser {
- case "Firefox":
- return fmt.Sprintf("Mozilla/5.0 (%s; rv:%s) Gecko/20100101 Firefox/%s", selectedOS, version, version)
- case "Chromium":
- return fmt.Sprintf("Mozilla/5.0 (%s) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/%s Safari/537.36", selectedOS, version)
- }
- return ""
-}
-
-func updateCachedUserAgents(newVersions BrowserData) {
- cache.Lock()
- defer cache.Unlock()
- for key, userAgent := range cache.data {
- randVal := rand.Float64()
- if randVal < 0.5 {
- updatedUserAgent := updateUserAgentVersion(userAgent, newVersions)
- cache.data[key] = updatedUserAgent
- }
- }
-}
-
-func updateUserAgentVersion(userAgent string, newVersions BrowserData) string {
- // Parse the current user agent to extract browser and version
- var browserType, version string
- if _, err := fmt.Sscanf(userAgent, "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/%s Safari/537.36", &version); err == nil {
- browserType = "Chromium"
- } else if _, err := fmt.Sscanf(userAgent, "Mozilla/5.0 (Windows NT 11.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/%s Safari/537.36", &version); err == nil {
- browserType = "Chromium"
- } else if _, err := fmt.Sscanf(userAgent, "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/%s Safari/537.36", &version); err == nil {
- browserType = "Chromium"
- } else if _, err := fmt.Sscanf(userAgent, "Mozilla/5.0 (X11; Ubuntu; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/%s Safari/537.36", &version); err == nil {
- browserType = "Chromium"
- } else if _, err := fmt.Sscanf(userAgent, "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/%s Safari/537.36", &version); err == nil {
- browserType = "Chromium"
- } else if _, err := fmt.Sscanf(userAgent, "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:%s) Gecko/20100101 Firefox/%s", &version, &version); err == nil {
- browserType = "Firefox"
- } else if _, err := fmt.Sscanf(userAgent, "Mozilla/5.0 (Windows NT 11.0; Win64; x64; rv:%s) Gecko/20100101 Firefox/%s", &version, &version); err == nil {
- browserType = "Firefox"
- } else if _, err := fmt.Sscanf(userAgent, "Mozilla/5.0 (X11; Linux x86_64; rv:%s) Gecko/20100101 Firefox/%s", &version, &version); err == nil {
- browserType = "Firefox"
- } else if _, err := fmt.Sscanf(userAgent, "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:%s) Gecko/20100101 Firefox/%s", &version, &version); err == nil {
- browserType = "Firefox"
- } else if _, err := fmt.Sscanf(userAgent, "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7; rv:%s) Gecko/20100101 Firefox/%s", &version, &version); err == nil {
- browserType = "Firefox"
- }
-
- // Get the latest version for the browser type
- var latestVersion string
- if browserType == "Firefox" {
- latestVersion = newVersions.Firefox[0].Version
- } else if browserType == "Chromium" {
- latestVersion = newVersions.Chromium[0].Version
- }
-
- // Update the user agent string with the new version
- return generateUserAgent(browserType, latestVersion)
-}
-
-func periodicUpdate() {
- for {
- // Sleep for a random interval between 1 and 2 days
- time.Sleep(time.Duration(24+rand.Intn(24)) * time.Hour)
-
- // Fetch the latest browser versions
- newVersions, err := fetchLatestBrowserVersions()
- if err != nil {
- fmt.Println("Error fetching latest browser versions:", err)
- continue
- }
-
- // Update the browser version cache
- browserCache.Lock()
- browserCache.data = newVersions
- browserCache.expires = time.Now().Add(24 * time.Hour)
- browserCache.Unlock()
-
- // Update the cached user agents
- updateCachedUserAgents(newVersions)
- }
-}
-
-func GetUserAgent(cacheKey string) (string, error) {
- cache.RLock()
- userAgent, found := cache.data[cacheKey]
- cache.RUnlock()
-
- if found {
- return userAgent, nil
- }
-
- userAgent, err := randomUserAgent()
- if err != nil {
- return "", err
- }
-
- cache.Lock()
- cache.data[cacheKey] = userAgent
- cache.Unlock()
-
- return userAgent, nil
-}
-
-func GetNewUserAgent(cacheKey string) (string, error) {
- userAgent, err := randomUserAgent()
- if err != nil {
- return "", err
- }
-
- cache.Lock()
- cache.data[cacheKey] = userAgent
- cache.Unlock()
-
- return userAgent, nil
-}
-
-func init() {
- go periodicUpdate()
-}
-
-// func main() {
-// go periodicUpdate() // not needed here
-
-// cacheKey := "image-search"
-// userAgent, err := GetUserAgent(cacheKey)
-// if err != nil {
-// fmt.Println("Error:", err)
-// return
-// }
-
-// fmt.Println("Generated User Agent:", userAgent)
-
-// // Request a new user agent for the same key
-// newUserAgent, err := GetNewUserAgent(cacheKey)
-// if err != nil {
-// fmt.Println("Error:", err)
-// return
-// }
-
-// fmt.Println("New User Agent:", newUserAgent)
-
-// AcacheKey := "image-search"
-// AuserAgent, err := GetUserAgent(AcacheKey)
-// if err != nil {
-// fmt.Println("Error:", err)
-// return
-// }
-
-// fmt.Println("Generated User Agent:", AuserAgent)
-
-// DcacheKey := "image-search"
-// DuserAgent, err := GetUserAgent(DcacheKey)
-// if err != nil {
-// fmt.Println("Error:", err)
-// return
-// }
-
-// fmt.Println("Generated User Agent:", DuserAgent)
-
-// }
+package main
+
+import (
+ "encoding/json"
+ "fmt"
+ "io/ioutil"
+ "math/rand"
+ "net/http"
+ "sort"
+ "sync"
+ "time"
+)
+
+type BrowserVersion struct {
+ Version string `json:"version"`
+ Global float64 `json:"global"`
+}
+
+type BrowserData struct {
+ Firefox []BrowserVersion `json:"firefox"`
+ Chromium []BrowserVersion `json:"chrome"`
+}
+
+var (
+ cache = struct {
+ sync.RWMutex
+ data map[string]string
+ }{
+ data: make(map[string]string),
+ }
+ browserCache = struct {
+ sync.RWMutex
+ data BrowserData
+ expires time.Time
+ }{
+ expires: time.Now(),
+ }
+)
+
+func fetchLatestBrowserVersions() (BrowserData, error) {
+ url := "https://raw.githubusercontent.com/Fyrd/caniuse/master/fulldata-json/data-2.0.json"
+
+ resp, err := http.Get(url)
+ if err != nil {
+ return BrowserData{}, err
+ }
+ defer resp.Body.Close()
+
+ body, err := ioutil.ReadAll(resp.Body)
+ if err != nil {
+ return BrowserData{}, err
+ }
+
+ var rawData map[string]interface{}
+ if err := json.Unmarshal(body, &rawData); err != nil {
+ return BrowserData{}, err
+ }
+
+ stats := rawData["agents"].(map[string]interface{})
+
+ var data BrowserData
+
+ if firefoxData, ok := stats["firefox"].(map[string]interface{}); ok {
+ for version, usage := range firefoxData["usage_global"].(map[string]interface{}) {
+ data.Firefox = append(data.Firefox, BrowserVersion{
+ Version: version,
+ Global: usage.(float64),
+ })
+ }
+ }
+
+ if chromeData, ok := stats["chrome"].(map[string]interface{}); ok {
+ for version, usage := range chromeData["usage_global"].(map[string]interface{}) {
+ data.Chromium = append(data.Chromium, BrowserVersion{
+ Version: version,
+ Global: usage.(float64),
+ })
+ }
+ }
+
+ return data, nil
+}
+
+func getLatestBrowserVersions() (BrowserData, error) {
+ browserCache.RLock()
+ if time.Now().Before(browserCache.expires) {
+ data := browserCache.data
+ browserCache.RUnlock()
+ return data, nil
+ }
+ browserCache.RUnlock()
+
+ data, err := fetchLatestBrowserVersions()
+ if err != nil {
+ return BrowserData{}, err
+ }
+
+ browserCache.Lock()
+ browserCache.data = data
+ browserCache.expires = time.Now().Add(24 * time.Hour)
+ browserCache.Unlock()
+
+ return data, nil
+}
+
+func randomUserAgent() (string, error) {
+ browsers, err := getLatestBrowserVersions()
+ if err != nil {
+ return "", err
+ }
+
+ rand.Seed(time.Now().UnixNano())
+
+ // Simulated browser usage statistics (in percentages)
+ usageStats := map[string]float64{
+ "Firefox": 30.0,
+ "Chromium": 70.0,
+ }
+
+ // Calculate the probabilities for the versions
+ probabilities := []float64{0.5, 0.25, 0.125, 0.0625, 0.03125, 0.015625, 0.0078125, 0.00390625}
+
+ // Select a browser based on usage statistics
+ browserType := ""
+ randVal := rand.Float64() * 100
+ cumulative := 0.0
+ for browser, usage := range usageStats {
+ cumulative += usage
+ if randVal < cumulative {
+ browserType = browser
+ break
+ }
+ }
+
+ var versions []BrowserVersion
+ switch browserType {
+ case "Firefox":
+ versions = browsers.Firefox
+ case "Chromium":
+ versions = browsers.Chromium
+ }
+
+ if len(versions) == 0 {
+ return "", fmt.Errorf("no versions found for browser: %s", browserType)
+ }
+
+ // Sort versions by usage (descending order)
+ sort.Slice(versions, func(i, j int) bool {
+ return versions[i].Global > versions[j].Global
+ })
+
+ // Select a version based on the probabilities
+ version := ""
+ randVal = rand.Float64()
+ cumulative = 0.0
+ for i, p := range probabilities {
+ cumulative += p
+ if randVal < cumulative && i < len(versions) {
+ version = versions[i].Version
+ break
+ }
+ }
+
+ if version == "" {
+ version = versions[len(versions)-1].Version
+ }
+
+ // Generate the user agent string
+ userAgent := generateUserAgent(browserType, version)
+ return userAgent, nil
+}
+
+func generateUserAgent(browser, version string) string {
+ oses := []struct {
+ os string
+ probability float64
+ }{
+ {"Windows NT 10.0; Win64; x64", 44.0},
+ {"Windows NT 11.0; Win64; x64", 44.0},
+ {"X11; Linux x86_64", 1.0},
+ {"X11; Ubuntu; Linux x86_64", 1.0},
+ {"Macintosh; Intel Mac OS X 10_15_7", 10.0},
+ }
+
+ // Select an OS based on probabilities
+ randVal := rand.Float64() * 100
+ cumulative := 0.0
+ selectedOS := ""
+ for _, os := range oses {
+ cumulative += os.probability
+ if randVal < cumulative {
+ selectedOS = os.os
+ break
+ }
+ }
+
+ switch browser {
+ case "Firefox":
+ return fmt.Sprintf("Mozilla/5.0 (%s; rv:%s) Gecko/20100101 Firefox/%s", selectedOS, version, version)
+ case "Chromium":
+ return fmt.Sprintf("Mozilla/5.0 (%s) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/%s Safari/537.36", selectedOS, version)
+ }
+ return ""
+}
+
+func updateCachedUserAgents(newVersions BrowserData) {
+ cache.Lock()
+ defer cache.Unlock()
+ for key, userAgent := range cache.data {
+ randVal := rand.Float64()
+ if randVal < 0.5 {
+ updatedUserAgent := updateUserAgentVersion(userAgent, newVersions)
+ cache.data[key] = updatedUserAgent
+ }
+ }
+}
+
+func updateUserAgentVersion(userAgent string, newVersions BrowserData) string {
+ // Parse the current user agent to extract browser and version
+ var browserType, version string
+ if _, err := fmt.Sscanf(userAgent, "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/%s Safari/537.36", &version); err == nil {
+ browserType = "Chromium"
+ } else if _, err := fmt.Sscanf(userAgent, "Mozilla/5.0 (Windows NT 11.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/%s Safari/537.36", &version); err == nil {
+ browserType = "Chromium"
+ } else if _, err := fmt.Sscanf(userAgent, "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/%s Safari/537.36", &version); err == nil {
+ browserType = "Chromium"
+ } else if _, err := fmt.Sscanf(userAgent, "Mozilla/5.0 (X11; Ubuntu; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/%s Safari/537.36", &version); err == nil {
+ browserType = "Chromium"
+ } else if _, err := fmt.Sscanf(userAgent, "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/%s Safari/537.36", &version); err == nil {
+ browserType = "Chromium"
+ } else if _, err := fmt.Sscanf(userAgent, "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:%s) Gecko/20100101 Firefox/%s", &version, &version); err == nil {
+ browserType = "Firefox"
+ } else if _, err := fmt.Sscanf(userAgent, "Mozilla/5.0 (Windows NT 11.0; Win64; x64; rv:%s) Gecko/20100101 Firefox/%s", &version, &version); err == nil {
+ browserType = "Firefox"
+ } else if _, err := fmt.Sscanf(userAgent, "Mozilla/5.0 (X11; Linux x86_64; rv:%s) Gecko/20100101 Firefox/%s", &version, &version); err == nil {
+ browserType = "Firefox"
+ } else if _, err := fmt.Sscanf(userAgent, "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:%s) Gecko/20100101 Firefox/%s", &version, &version); err == nil {
+ browserType = "Firefox"
+ } else if _, err := fmt.Sscanf(userAgent, "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7; rv:%s) Gecko/20100101 Firefox/%s", &version, &version); err == nil {
+ browserType = "Firefox"
+ }
+
+ // Get the latest version for the browser type
+ var latestVersion string
+ if browserType == "Firefox" {
+ latestVersion = newVersions.Firefox[0].Version
+ } else if browserType == "Chromium" {
+ latestVersion = newVersions.Chromium[0].Version
+ }
+
+ // Update the user agent string with the new version
+ return generateUserAgent(browserType, latestVersion)
+}
+
+func periodicUpdate() {
+ for {
+ // Sleep for a random interval between 1 and 2 days
+ time.Sleep(time.Duration(24+rand.Intn(24)) * time.Hour)
+
+ // Fetch the latest browser versions
+ newVersions, err := fetchLatestBrowserVersions()
+ if err != nil {
+ printWarn("Error fetching latest browser versions: %v", err)
+ continue
+ }
+
+ // Update the browser version cache
+ browserCache.Lock()
+ browserCache.data = newVersions
+ browserCache.expires = time.Now().Add(24 * time.Hour)
+ browserCache.Unlock()
+
+ // Update the cached user agents
+ updateCachedUserAgents(newVersions)
+ }
+}
+
+func GetUserAgent(cacheKey string) (string, error) {
+ cache.RLock()
+ userAgent, found := cache.data[cacheKey]
+ cache.RUnlock()
+
+ if found {
+ return userAgent, nil
+ }
+
+ userAgent, err := randomUserAgent()
+ if err != nil {
+ return "", err
+ }
+
+ cache.Lock()
+ cache.data[cacheKey] = userAgent
+ cache.Unlock()
+
+ return userAgent, nil
+}
+
+func GetNewUserAgent(cacheKey string) (string, error) {
+ userAgent, err := randomUserAgent()
+ if err != nil {
+ return "", err
+ }
+
+ cache.Lock()
+ cache.data[cacheKey] = userAgent
+ cache.Unlock()
+
+ return userAgent, nil
+}
+
+func init() {
+ go periodicUpdate()
+}
+
+// func main() {
+// go periodicUpdate() // not needed here
+
+// cacheKey := "image-search"
+// userAgent, err := GetUserAgent(cacheKey)
+// if err != nil {
+// fmt.Println("Error:", err)
+// return
+// }
+
+// fmt.Println("Generated User Agent:", userAgent)
+
+// // Request a new user agent for the same key
+// newUserAgent, err := GetNewUserAgent(cacheKey)
+// if err != nil {
+// fmt.Println("Error:", err)
+// return
+// }
+
+// fmt.Println("New User Agent:", newUserAgent)
+
+// AcacheKey := "image-search"
+// AuserAgent, err := GetUserAgent(AcacheKey)
+// if err != nil {
+// fmt.Println("Error:", err)
+// return
+// }
+
+// fmt.Println("Generated User Agent:", AuserAgent)
+
+// DcacheKey := "image-search"
+// DuserAgent, err := GetUserAgent(DcacheKey)
+// if err != nil {
+// fmt.Println("Error:", err)
+// return
+// }
+
+// fmt.Println("Generated User Agent:", DuserAgent)
+
+// }
diff --git a/cache.go b/cache.go
index db982a8..b0b7a18 100644
--- a/cache.go
+++ b/cache.go
@@ -2,7 +2,6 @@ package main
import (
"fmt"
- "log"
"sync"
"time"
@@ -143,7 +142,7 @@ func (rc *ResultsCache) checkAndCleanCache() {
func (rc *ResultsCache) memoryUsage() float64 {
v, err := mem.VirtualMemory()
if err != nil {
- log.Printf("Failed to get memory info: %v", err)
+ printErr("Failed to get memory info: %v", err)
return 0
}
@@ -167,7 +166,7 @@ func (rc *ResultsCache) cleanOldestItems() {
if oldestKey != "" {
delete(rc.results, oldestKey)
- log.Printf("Removed oldest cache item: %s", oldestKey)
+ printDebug("Removed oldest cache item: %s", oldestKey)
} else {
break
}
diff --git a/common.go b/common.go
old mode 100644
new mode 100755
index 7d30861..3915f24
--- a/common.go
+++ b/common.go
@@ -1,17 +1,49 @@
-package main
-
-import (
- "html/template"
-)
-
-var (
- debugMode bool = true
- funcs = template.FuncMap{
- "sub": func(a, b int) int {
- return a - b
- },
- "add": func(a, b int) int {
- return a + b
- },
- }
-)
+package main
+
+import (
+ "crypto/rand"
+ "encoding/base64"
+ "html/template"
+ "strings"
+)
+
+var (
+ funcs = template.FuncMap{
+ "sub": func(a, b int) int {
+ return a - b
+ },
+ "add": func(a, b int) int {
+ return a + b
+ },
+ }
+)
+
+func generateStrongRandomString(length int) string {
+ bytes := make([]byte, length)
+ _, err := rand.Read(bytes)
+ if err != nil {
+ printErr("Error generating random string: %v", err)
+ }
+ return base64.URLEncoding.EncodeToString(bytes)[:length]
+}
+
+// Checks if the URL already includes a protocol
+func hasProtocol(url string) bool {
+ return strings.HasPrefix(url, "http://") || strings.HasPrefix(url, "https://")
+}
+
+// Checks if the domain is a local address
+func isLocalAddress(domain string) bool {
+ return domain == "localhost" || strings.HasPrefix(domain, "127.") || strings.HasPrefix(domain, "192.168.") || strings.HasPrefix(domain, "10.")
+}
+
+// Ensures that the address is prefixed with http:// or https:// if needed
+func addProtocol(domain string) string {
+ if hasProtocol(domain) {
+ return domain
+ }
+ if isLocalAddress(domain) {
+ return "http://" + domain
+ }
+ return "https://" + domain
+}
diff --git a/config.go b/config.go
new file mode 100644
index 0000000..a411961
--- /dev/null
+++ b/config.go
@@ -0,0 +1,183 @@
+package main
+
+import (
+ "bufio"
+ "log"
+ "os"
+ "strconv"
+ "strings"
+ "sync"
+
+ "github.com/fsnotify/fsnotify"
+ "gopkg.in/ini.v1"
+)
+
+func initConfig() error {
+ if _, err := os.Stat(configFilePath); os.IsNotExist(err) {
+ return createConfig()
+ }
+
+ printInfo("Configuration file already exists.")
+ config = loadConfig()
+ return nil
+}
+
+func createConfig() error {
+ reader := bufio.NewReader(os.Stdin)
+
+ printMessage("Configuration file not found.")
+ printMessage("Do you want to use default values? (yes/no): ")
+ useDefaults, _ := reader.ReadString('\n')
+
+ if strings.TrimSpace(useDefaults) != "yes" {
+ printMessage("Enter port (default 5000): ")
+ portStr, _ := reader.ReadString('\n')
+ if portStr != "\n" {
+ port, err := strconv.Atoi(strings.TrimSpace(portStr))
+ if err != nil {
+ config.Port = 5000
+ } else {
+ config.Port = port
+ }
+ }
+
+ printMessage("Enter your domain address (default localhost): ")
+ domain, _ := reader.ReadString('\n')
+ if domain != "\n" {
+ config.Domain = strings.TrimSpace(domain)
+ }
+ } else {
+ config = defaultConfig
+ }
+
+ if config.AuthCode == "" {
+ config.AuthCode = generateStrongRandomString(64)
+ printMessage("Generated connection code: %s\n", config.AuthCode)
+ }
+
+ config.NodesEnabled = len(config.Peers) > 0
+ config.CrawlerEnabled = true
+ config.WebsiteEnabled = true
+ config.LogLevel = 1
+
+ saveConfig(config)
+ return nil
+}
+
+func saveConfig(config Config) {
+ cfg := ini.Empty()
+ sec := cfg.Section("")
+ sec.Key("Port").SetValue(strconv.Itoa(config.Port))
+ sec.Key("AuthCode").SetValue(config.AuthCode)
+ sec.Key("PeerID").SetValue(config.PeerID)
+
+ peers := strings.Join(config.Peers, ",")
+ sec.Key("Peers").SetValue(peers)
+ sec.Key("Domain").SetValue(config.Domain)
+ sec.Key("NodesEnabled").SetValue(strconv.FormatBool(config.NodesEnabled))
+ sec.Key("CrawlerEnabled").SetValue(strconv.FormatBool(config.CrawlerEnabled))
+ sec.Key("WebsiteEnabled").SetValue(strconv.FormatBool(config.WebsiteEnabled))
+ sec.Key("LogLevel").SetValue(strconv.Itoa(config.LogLevel))
+
+ err := cfg.SaveTo(configFilePath)
+ if err != nil {
+ printErr("Error writing to config file: %v", err)
+ }
+}
+
+func loadConfig() Config {
+ cfg, err := ini.Load(configFilePath)
+ if err != nil {
+ printErr("Error opening config file: %v", err)
+ }
+
+ port, err := cfg.Section("").Key("Port").Int()
+ if err != nil || port == 0 {
+ port = 5000 // Default to 5000 if not set or error
+ }
+
+ peersStr := cfg.Section("").Key("Peers").String()
+ var peers []string
+ if peersStr != "" {
+ peers = strings.Split(peersStr, ",")
+ for i, peer := range peers {
+ peers[i] = addProtocol(peer)
+ }
+ }
+
+ domain := cfg.Section("").Key("Domain").String()
+ if domain == "" {
+ domain = "localhost" // Default to localhost if not set
+ }
+
+ nodesEnabled, err := cfg.Section("").Key("NodesEnabled").Bool()
+ if err != nil { // If NodesEnabled is not found in config
+ nodesEnabled = len(peers) > 0 // Enable nodes if peers are configured
+ }
+
+ crawlerEnabled, err := cfg.Section("").Key("CrawlerEnabled").Bool()
+ if err != nil { // Default to true if not found
+ crawlerEnabled = true
+ }
+
+ websiteEnabled, err := cfg.Section("").Key("WebsiteEnabled").Bool()
+ if err != nil { // Default to true if not found
+ websiteEnabled = true
+ }
+
+ logLevel, err := cfg.Section("").Key("LogLevel").Int()
+ if err != nil || logLevel < 0 || logLevel > 4 { // Default to 1 if not found or out of range
+ logLevel = 1
+ }
+
+ config = Config{
+ Port: port,
+ AuthCode: cfg.Section("").Key("AuthCode").String(),
+ PeerID: cfg.Section("").Key("PeerID").String(),
+ Peers: peers,
+ Domain: domain,
+ NodesEnabled: nodesEnabled,
+ CrawlerEnabled: crawlerEnabled,
+ WebsiteEnabled: websiteEnabled,
+ LogLevel: logLevel,
+ }
+
+ return config
+}
+
+var configLock sync.RWMutex
+
+func startFileWatcher() {
+ watcher, err := fsnotify.NewWatcher()
+ if err != nil {
+ printErr("%v", err)
+ }
+
+ go func() {
+ defer watcher.Close()
+ for {
+ select {
+ case event, ok := <-watcher.Events:
+ if !ok {
+ return
+ }
+ if event.Op&fsnotify.Write == fsnotify.Write {
+ printInfo("Modified file: %v", event.Name)
+ configLock.Lock()
+ config = loadConfig()
+ configLock.Unlock()
+ }
+ case err, ok := <-watcher.Errors:
+ if !ok {
+ return
+ }
+ printWarn("Error watching configuration file: %v", err)
+ }
+ }
+ }()
+
+ err = watcher.Add(configFilePath)
+ if err != nil {
+ log.Fatal(err)
+ }
+}
diff --git a/files.go b/files.go
old mode 100644
new mode 100755
index bce60b0..cbafb66
--- a/files.go
+++ b/files.go
@@ -1,246 +1,261 @@
-package main
-
-import (
- "fmt"
- "html/template"
- "log"
- "net/http"
- "net/url"
- "regexp"
- "sort"
- "strconv"
- "strings"
- "time"
-)
-
-type Settings struct {
- UxLang string
- Safe string
-}
-
-type TorrentSite interface {
- Name() string
- Search(query string, category string) ([]TorrentResult, error)
-}
-
-var (
- torrentGalaxy TorrentSite
- nyaa TorrentSite
- thePirateBay TorrentSite
- rutor TorrentSite
-)
-
-func initializeTorrentSites() {
- torrentGalaxy = NewTorrentGalaxy()
- // nyaa = NewNyaa()
- thePirateBay = NewThePirateBay()
- // rutor = NewRutor()
-}
-
-func handleFileSearch(w http.ResponseWriter, query, safe, lang string, page int) {
- startTime := time.Now()
-
- cacheKey := CacheKey{Query: query, Page: page, Safe: safe == "true", Lang: lang, Type: "file"}
- combinedResults := getFileResultsFromCacheOrFetch(cacheKey, query, safe, lang, page)
-
- sort.Slice(combinedResults, func(i, j int) bool { return combinedResults[i].Seeders > combinedResults[j].Seeders })
-
- elapsedTime := time.Since(startTime)
- funcMap := template.FuncMap{
- "sub": subtract,
- "add": add,
- }
- tmpl, err := template.New("files.html").Funcs(funcMap).ParseFiles("templates/files.html")
- if err != nil {
- log.Printf("Failed to load template: %v", err)
- http.Error(w, "Failed to load template", http.StatusInternalServerError)
- return
- }
-
- data := struct {
- Results []TorrentResult
- Query string
- Fetched string
- Category string
- Sort string
- HasPrevPage bool
- HasNextPage bool
- Page int
- Settings Settings
- }{
- Results: combinedResults,
- Query: query,
- Fetched: fmt.Sprintf("%.2f", elapsedTime.Seconds()),
- Category: "all",
- Sort: "seed",
- HasPrevPage: page > 1,
- HasNextPage: len(combinedResults) > 0,
- Page: page,
- Settings: Settings{UxLang: lang, Safe: safe},
- }
-
- // Debugging: Print results before rendering template
- for _, result := range combinedResults {
- fmt.Printf("Title: %s, Magnet: %s\n", result.Title, result.Magnet)
- }
-
- if err := tmpl.Execute(w, data); err != nil {
- log.Printf("Failed to render template: %v", err)
- http.Error(w, "Failed to render template", http.StatusInternalServerError)
- }
-}
-
-func getFileResultsFromCacheOrFetch(cacheKey CacheKey, query, safe, lang string, page int) []TorrentResult {
- cacheChan := make(chan []SearchResult)
- var combinedResults []TorrentResult
-
- go func() {
- results, exists := resultsCache.Get(cacheKey)
- if exists {
- log.Println("Cache hit")
- cacheChan <- results
- } else {
- log.Println("Cache miss")
- cacheChan <- nil
- }
- }()
-
- select {
- case results := <-cacheChan:
- if results == nil {
- combinedResults = fetchAndCacheFileResults(query, safe, lang, page)
- } else {
- _, torrentResults, _ := convertToSpecificResults(results)
- combinedResults = torrentResults
- }
- case <-time.After(2 * time.Second):
- log.Println("Cache check timeout")
- combinedResults = fetchAndCacheFileResults(query, safe, lang, page)
- }
-
- return combinedResults
-}
-
-func fetchAndCacheFileResults(query, safe, lang string, page int) []TorrentResult {
- sites := []TorrentSite{torrentGalaxy, nyaa, thePirateBay, rutor}
- results := []TorrentResult{}
-
- for _, site := range sites {
- if site == nil {
- continue
- }
- res, err := site.Search(query, "all")
- if err != nil {
- continue
- }
- for _, r := range res {
- r.Magnet = removeMagnetLink(r.Magnet) // Remove "magnet:", prehaps usless now?
- results = append(results, r)
- }
- }
-
- // Cache the valid results
- cacheKey := CacheKey{Query: query, Page: page, Safe: safe == "true", Lang: lang, Type: "file"}
- resultsCache.Set(cacheKey, convertToSearchResults(results))
-
- return results
-}
-
-func removeMagnetLink(magnet string) string {
- // Remove the magnet: prefix unconditionally
- return strings.TrimPrefix(magnet, "magnet:")
-}
-
-func subtract(a, b int) int {
- return a - b
-}
-
-func add(a, b int) int {
- return a + b
-}
-
-func parseInt(s string) int {
- i, err := strconv.Atoi(s)
- if err != nil {
- return 0
- }
- return i
-}
-
-func parseSize(sizeStr string) int64 {
- sizeStr = strings.TrimSpace(sizeStr)
- if sizeStr == "" {
- return 0
- }
-
- // Use regex to extract numeric value and unit separately
- re := regexp.MustCompile(`(?i)([\d.]+)\s*([KMGT]?B)`)
- matches := re.FindStringSubmatch(sizeStr)
- if len(matches) < 3 {
- log.Printf("Error parsing size: invalid format %s", sizeStr)
- return 0
- }
-
- sizeStr = matches[1]
- unit := strings.ToUpper(matches[2])
-
- var multiplier int64 = 1
- switch unit {
- case "KB":
- multiplier = 1024
- case "MB":
- multiplier = 1024 * 1024
- case "GB":
- multiplier = 1024 * 1024 * 1024
- case "TB":
- multiplier = 1024 * 1024 * 1024 * 1024
- default:
- log.Printf("Unknown unit: %s", unit)
- return 0
- }
-
- size, err := strconv.ParseFloat(sizeStr, 64)
- if err != nil {
- log.Printf("Error parsing size: %v", err)
- return 0
- }
- return int64(size * float64(multiplier))
-}
-
-// apparently this is needed so it can announce that magnet link is being used and people start seeding it, but I dont like the fact that I add trackers purposefully
-func applyTrackers(magnetLink string) string {
- if magnetLink == "" {
- return ""
- }
- trackers := []string{
- "udp://tracker.openbittorrent.com:80/announce",
- "udp://tracker.opentrackr.org:1337/announce",
- "udp://tracker.coppersurfer.tk:6969/announce",
- "udp://tracker.leechers-paradise.org:6969/announce",
- }
- for _, tracker := range trackers {
- magnetLink += "&tr=" + url.QueryEscape(tracker)
- }
- return magnetLink
-}
-
-func formatSize(size int64) string {
- if size >= 1024*1024*1024*1024 {
- return fmt.Sprintf("%.2f TB", float64(size)/(1024*1024*1024*1024))
- } else if size >= 1024*1024*1024 {
- return fmt.Sprintf("%.2f GB", float64(size)/(1024*1024*1024))
- } else if size >= 1024*1024 {
- return fmt.Sprintf("%.2f MB", float64(size)/(1024*1024))
- } else if size >= 1024 {
- return fmt.Sprintf("%.2f KB", float64(size)/1024)
- }
- return fmt.Sprintf("%d B", size)
-}
-
-func sanitizeFileName(name string) string {
- // Replace spaces with dashes
- sanitized := regexp.MustCompile(`\s+`).ReplaceAllString(name, "-")
- // Remove any characters that are not alphanumeric, dashes, or parentheses
- sanitized = regexp.MustCompile(`[^a-zA-Z0-9\-\(\)]`).ReplaceAllString(sanitized, "")
- return sanitized
-}
+package main
+
+import (
+ "fmt"
+ "html/template"
+ "net/http"
+ "net/url"
+ "regexp"
+ "sort"
+ "strconv"
+ "strings"
+ "time"
+)
+
+type Settings struct {
+ UxLang string
+ Safe string
+}
+
+type TorrentSite interface {
+ Name() string
+ Search(query string, category string) ([]TorrentResult, error)
+}
+
+var (
+ torrentGalaxy TorrentSite
+ nyaa TorrentSite
+ thePirateBay TorrentSite
+ rutor TorrentSite
+)
+
+var fileResultsChan = make(chan []TorrentResult)
+
+func initializeTorrentSites() {
+ torrentGalaxy = NewTorrentGalaxy()
+ // nyaa = NewNyaa()
+ thePirateBay = NewThePirateBay()
+ // rutor = NewRutor()
+}
+
+func handleFileSearch(w http.ResponseWriter, settings UserSettings, query string, page int) {
+ startTime := time.Now()
+
+ cacheKey := CacheKey{Query: query, Page: page, Safe: settings.SafeSearch == "true", Lang: settings.Language, Type: "file"}
+ combinedResults := getFileResultsFromCacheOrFetch(cacheKey, query, settings.SafeSearch, settings.Language, page)
+
+ sort.Slice(combinedResults, func(i, j int) bool { return combinedResults[i].Seeders > combinedResults[j].Seeders })
+
+ elapsedTime := time.Since(startTime)
+ funcMap := template.FuncMap{
+ "sub": func(a, b int) int { return a - b },
+ "add": func(a, b int) int { return a + b },
+ }
+ tmpl, err := template.New("files.html").Funcs(funcMap).ParseFiles("templates/files.html")
+ if err != nil {
+ printErr("Failed to load template: %v", err)
+ http.Error(w, "Failed to load template", http.StatusInternalServerError)
+ return
+ }
+
+ data := struct {
+ Results []TorrentResult
+ Query string
+ Fetched string
+ Category string
+ Sort string
+ Page int
+ HasPrevPage bool
+ HasNextPage bool
+ LanguageOptions []LanguageOption
+ CurrentLang string
+ Theme string
+ Safe string
+ }{
+ Results: combinedResults,
+ Query: query,
+ Fetched: fmt.Sprintf("%.2f seconds", elapsedTime.Seconds()),
+ Category: "all",
+ Sort: "seed",
+ Page: page,
+ HasPrevPage: page > 1,
+ HasNextPage: len(combinedResults) > 0,
+ LanguageOptions: languageOptions,
+ CurrentLang: settings.Language,
+ Theme: settings.Theme,
+ Safe: settings.SafeSearch,
+ }
+
+ // // Debugging: Print results before rendering template
+ // for _, result := range combinedResults {
+ // fmt.Printf("Title: %s, Magnet: %s\n", result.Title, result.Magnet)
+ // }
+
+ if err := tmpl.Execute(w, data); err != nil {
+ printErr("Failed to render template: %v", err)
+ http.Error(w, "Failed to render template", http.StatusInternalServerError)
+ }
+}
+
+func getFileResultsFromCacheOrFetch(cacheKey CacheKey, query, safe, lang string, page int) []TorrentResult {
+ cacheChan := make(chan []SearchResult)
+ var combinedResults []TorrentResult
+
+ go func() {
+ results, exists := resultsCache.Get(cacheKey)
+ if exists {
+ printInfo("Cache hit")
+ cacheChan <- results
+ } else {
+ printInfo("Cache miss")
+ cacheChan <- nil
+ }
+ }()
+
+ select {
+ case results := <-cacheChan:
+ if results == nil {
+ combinedResults = fetchFileResults(query, safe, lang, page)
+ if len(combinedResults) > 0 {
+ resultsCache.Set(cacheKey, convertToSearchResults(combinedResults))
+ }
+ } else {
+ _, torrentResults, _ := convertToSpecificResults(results)
+ combinedResults = torrentResults
+ }
+ case <-time.After(2 * time.Second):
+ printInfo("Cache check timeout")
+ combinedResults = fetchFileResults(query, safe, lang, page)
+ if len(combinedResults) > 0 {
+ resultsCache.Set(cacheKey, convertToSearchResults(combinedResults))
+ }
+ }
+
+ return combinedResults
+}
+
+func fetchFileResults(query, safe, lang string, page int) []TorrentResult {
+ sites := []TorrentSite{torrentGalaxy, nyaa, thePirateBay, rutor}
+ results := []TorrentResult{}
+
+ for _, site := range sites {
+ if site == nil {
+ continue
+ }
+ res, err := site.Search(query, "all")
+ if err != nil {
+ continue
+ }
+ for _, r := range res {
+			r.Magnet = removeMagnetLink(r.Magnet) // Remove the "magnet:" prefix; perhaps useless now?
+ results = append(results, r)
+ }
+ }
+
+ if len(results) == 0 {
+ printWarn("No file results found for query: %s, trying other nodes", query)
+ results = tryOtherNodesForFileSearch(query, safe, lang, page, []string{hostID})
+ }
+
+ return results
+}
+
+func removeMagnetLink(magnet string) string {
+ // Remove the magnet: prefix unconditionally
+ return strings.TrimPrefix(magnet, "magnet:")
+}
+
+func parseInt(s string) int {
+ i, err := strconv.Atoi(s)
+ if err != nil {
+ return 0
+ }
+ return i
+}
+
+func parseSize(sizeStr string) int64 {
+ sizeStr = strings.TrimSpace(sizeStr)
+ if sizeStr == "" {
+ return 0
+ }
+
+ // Use regex to extract numeric value and unit separately
+ re := regexp.MustCompile(`(?i)([\d.]+)\s*([KMGT]?B)`)
+ matches := re.FindStringSubmatch(sizeStr)
+ if len(matches) < 3 {
+ printWarn("Error parsing size: invalid format %s", sizeStr)
+ return 0
+ }
+
+ sizeStr = matches[1]
+ unit := strings.ToUpper(matches[2])
+
+ var multiplier int64 = 1
+ switch unit {
+ case "KB":
+ multiplier = 1024
+ case "MB":
+ multiplier = 1024 * 1024
+ case "GB":
+ multiplier = 1024 * 1024 * 1024
+ case "TB":
+ multiplier = 1024 * 1024 * 1024 * 1024
+ default:
+ printWarn("Unknown unit: %s", unit)
+ return 0
+ }
+
+ size, err := strconv.ParseFloat(sizeStr, 64)
+ if err != nil {
+ printWarn("Error parsing size: %v", err)
+ return 0
+ }
+ return int64(size * float64(multiplier))
+}
+
+// Apparently this is needed so clients can announce the magnet link and peers start seeding it, but I don't like adding trackers on purpose
+func applyTrackers(magnetLink string) string {
+ if magnetLink == "" {
+ return ""
+ }
+ trackers := []string{
+ "udp://tracker.openbittorrent.com:80/announce",
+ "udp://tracker.opentrackr.org:1337/announce",
+ "udp://tracker.coppersurfer.tk:6969/announce",
+ "udp://tracker.leechers-paradise.org:6969/announce",
+ }
+ for _, tracker := range trackers {
+ magnetLink += "&tr=" + url.QueryEscape(tracker)
+ }
+ return magnetLink
+}
+
+func formatSize(size int64) string {
+ if size >= 1024*1024*1024*1024 {
+ return fmt.Sprintf("%.2f TB", float64(size)/(1024*1024*1024*1024))
+ } else if size >= 1024*1024*1024 {
+ return fmt.Sprintf("%.2f GB", float64(size)/(1024*1024*1024))
+ } else if size >= 1024*1024 {
+ return fmt.Sprintf("%.2f MB", float64(size)/(1024*1024))
+ } else if size >= 1024 {
+ return fmt.Sprintf("%.2f KB", float64(size)/1024)
+ }
+ return fmt.Sprintf("%d B", size)
+}
+
+func sanitizeFileName(name string) string {
+ // Replace spaces with dashes
+ sanitized := regexp.MustCompile(`\s+`).ReplaceAllString(name, "-")
+ // Remove any characters that are not alphanumeric, dashes, or parentheses
+ sanitized = regexp.MustCompile(`[^a-zA-Z0-9\-\(\)]`).ReplaceAllString(sanitized, "")
+ return sanitized
+}
+
+func contains(slice []string, item string) bool {
+ for _, v := range slice {
+ if v == item {
+ return true
+ }
+ }
+ return false
+}
diff --git a/forums.go b/forums.go
old mode 100644
new mode 100755
index dc32012..d0be4f6
--- a/forums.go
+++ b/forums.go
@@ -1,139 +1,144 @@
-package main
-
-import (
- "encoding/json"
- "fmt"
- "html/template"
- "math"
- "net/http"
- "net/url"
- "time"
-)
-
-func PerformRedditSearch(query string, safe string, page int) ([]ForumSearchResult, error) {
- const (
- pageSize = 25
- baseURL = "https://www.reddit.com"
- maxRetries = 5
- initialBackoff = 2 * time.Second
- )
- var results []ForumSearchResult
-
- searchURL := fmt.Sprintf("%s/search.json?q=%s&limit=%d&start=%d", baseURL, url.QueryEscape(query), pageSize, page*pageSize)
- var resp *http.Response
- var err error
-
- // Retry logic with exponential backoff
- for i := 0; i <= maxRetries; i++ {
- resp, err = http.Get(searchURL)
- if err != nil {
- return nil, fmt.Errorf("making request: %v", err)
- }
- if resp.StatusCode != http.StatusTooManyRequests {
- break
- }
-
- // Wait for some time before retrying
- backoff := time.Duration(math.Pow(2, float64(i))) * initialBackoff
- time.Sleep(backoff)
- }
-
- if err != nil {
- return nil, fmt.Errorf("making request: %v", err)
- }
- defer resp.Body.Close()
-
- if resp.StatusCode != http.StatusOK {
- return nil, fmt.Errorf("unexpected status code: %d", resp.StatusCode)
- }
-
- var searchResults map[string]interface{}
- if err := json.NewDecoder(resp.Body).Decode(&searchResults); err != nil {
- return nil, fmt.Errorf("decoding response: %v", err)
- }
-
- data, ok := searchResults["data"].(map[string]interface{})
- if !ok {
- return nil, fmt.Errorf("no data field in response")
- }
-
- posts, ok := data["children"].([]interface{})
- if !ok {
- return nil, fmt.Errorf("no children field in data")
- }
-
- for _, post := range posts {
- postData := post.(map[string]interface{})["data"].(map[string]interface{})
-
- if safe == "active" && postData["over_18"].(bool) {
- continue
- }
-
- header := postData["title"].(string)
- description := postData["selftext"].(string)
- if len(description) > 500 {
- description = description[:500] + "..."
- }
- publishedDate := time.Unix(int64(postData["created_utc"].(float64)), 0)
- permalink := postData["permalink"].(string)
- resultURL := fmt.Sprintf("%s%s", baseURL, permalink)
-
- result := ForumSearchResult{
- URL: resultURL,
- Header: header,
- Description: description,
- PublishedDate: publishedDate,
- }
-
- thumbnail := postData["thumbnail"].(string)
- if parsedURL, err := url.Parse(thumbnail); err == nil && parsedURL.Scheme != "" {
- result.ImgSrc = postData["url"].(string)
- result.ThumbnailSrc = thumbnail
- }
-
- results = append(results, result)
- }
-
- return results, nil
-}
-
-func handleForumsSearch(w http.ResponseWriter, query, safe, lang string, page int) {
- results, err := PerformRedditSearch(query, safe, page)
- if err != nil {
- http.Error(w, fmt.Sprintf("Error performing search: %v", err), http.StatusInternalServerError)
- return
- }
-
- data := struct {
- Query string
- Results []ForumSearchResult
- LanguageOptions []LanguageOption
- CurrentLang string
- Page int
- HasPrevPage bool
- HasNextPage bool
- }{
- Query: query,
- Results: results,
- LanguageOptions: languageOptions,
- CurrentLang: lang,
- Page: page,
- HasPrevPage: page > 1,
- HasNextPage: len(results) == 25,
- }
-
- funcMap := template.FuncMap{
- "sub": func(a, b int) int { return a - b },
- "add": func(a, b int) int { return a + b },
- }
-
- tmpl, err := template.New("forums.html").Funcs(funcMap).ParseFiles("templates/forums.html")
- if err != nil {
- http.Error(w, fmt.Sprintf("Error loading template: %v", err), http.StatusInternalServerError)
- return
- }
-
- if err := tmpl.Execute(w, data); err != nil {
- http.Error(w, fmt.Sprintf("Error rendering template: %v", err), http.StatusInternalServerError)
- }
-}
+package main
+
+import (
+ "encoding/json"
+ "fmt"
+ "html/template"
+ "log"
+ "math"
+ "net/http"
+ "net/url"
+ "time"
+)
+
+func PerformRedditSearch(query string, safe string, page int) ([]ForumSearchResult, error) {
+ const (
+ pageSize = 25
+ baseURL = "https://www.reddit.com"
+ maxRetries = 5
+ initialBackoff = 2 * time.Second
+ )
+ var results []ForumSearchResult
+
+ searchURL := fmt.Sprintf("%s/search.json?q=%s&limit=%d&start=%d", baseURL, url.QueryEscape(query), pageSize, page*pageSize)
+ var resp *http.Response
+ var err error
+
+ // Retry logic with exponential backoff
+ for i := 0; i <= maxRetries; i++ {
+ resp, err = http.Get(searchURL)
+ if err != nil {
+ return nil, fmt.Errorf("making request: %v", err)
+ }
+ if resp.StatusCode != http.StatusTooManyRequests {
+ break
+ }
+
+ // Wait for some time before retrying
+ backoff := time.Duration(math.Pow(2, float64(i))) * initialBackoff
+ time.Sleep(backoff)
+ }
+
+ if err != nil {
+ return nil, fmt.Errorf("making request: %v", err)
+ }
+ defer resp.Body.Close()
+
+ if resp.StatusCode != http.StatusOK {
+ return nil, fmt.Errorf("unexpected status code: %d", resp.StatusCode)
+ }
+
+ var searchResults map[string]interface{}
+ if err := json.NewDecoder(resp.Body).Decode(&searchResults); err != nil {
+ return nil, fmt.Errorf("decoding response: %v", err)
+ }
+
+ data, ok := searchResults["data"].(map[string]interface{})
+ if !ok {
+ return nil, fmt.Errorf("no data field in response")
+ }
+
+ posts, ok := data["children"].([]interface{})
+ if !ok {
+ return nil, fmt.Errorf("no children field in data")
+ }
+
+ for _, post := range posts {
+ postData := post.(map[string]interface{})["data"].(map[string]interface{})
+
+ if safe == "active" && postData["over_18"].(bool) {
+ continue
+ }
+
+ header := postData["title"].(string)
+ description := postData["selftext"].(string)
+ if len(description) > 500 {
+ description = description[:500] + "..."
+ }
+ publishedDate := time.Unix(int64(postData["created_utc"].(float64)), 0)
+ permalink := postData["permalink"].(string)
+ resultURL := fmt.Sprintf("%s%s", baseURL, permalink)
+
+ result := ForumSearchResult{
+ URL: resultURL,
+ Header: header,
+ Description: description,
+ PublishedDate: publishedDate,
+ }
+
+ thumbnail := postData["thumbnail"].(string)
+ if parsedURL, err := url.Parse(thumbnail); err == nil && parsedURL.Scheme != "" {
+ result.ImgSrc = postData["url"].(string)
+ result.ThumbnailSrc = thumbnail
+ }
+
+ results = append(results, result)
+ }
+
+ return results, nil
+}
+
+func handleForumsSearch(w http.ResponseWriter, settings UserSettings, query string, page int) {
+ results, err := PerformRedditSearch(query, settings.SafeSearch, page)
+	if err != nil || len(results) == 0 { // Fall back to other nodes when the search errors or returns nothing
+ log.Printf("No results from primary search, trying other nodes")
+ results = tryOtherNodesForForumSearch(query, settings.SafeSearch, settings.Language, page)
+ }
+
+ data := struct {
+ Query string
+ Results []ForumSearchResult
+ Page int
+ HasPrevPage bool
+ HasNextPage bool
+ LanguageOptions []LanguageOption
+ CurrentLang string
+ Theme string
+ Safe string
+ }{
+ Query: query,
+ Results: results,
+ Page: page,
+ HasPrevPage: page > 1,
+ HasNextPage: len(results) == 25,
+ LanguageOptions: languageOptions,
+ CurrentLang: settings.Language,
+ Theme: settings.Theme,
+ Safe: settings.SafeSearch,
+ }
+
+ funcMap := template.FuncMap{
+ "sub": func(a, b int) int { return a - b },
+ "add": func(a, b int) int { return a + b },
+ }
+
+ tmpl, err := template.New("forums.html").Funcs(funcMap).ParseFiles("templates/forums.html")
+ if err != nil {
+ http.Error(w, fmt.Sprintf("Error loading template: %v", err), http.StatusInternalServerError)
+ return
+ }
+
+ if err := tmpl.Execute(w, data); err != nil {
+ http.Error(w, fmt.Sprintf("Error rendering template: %v", err), http.StatusInternalServerError)
+ }
+}
diff --git a/go.mod b/go.mod
index 1b9bbf4..f6755b1 100644
--- a/go.mod
+++ b/go.mod
@@ -9,15 +9,20 @@ require (
github.com/chromedp/cdproto v0.0.0-20240202021202-6d0b6a386732 // indirect
github.com/chromedp/chromedp v0.9.5 // indirect
github.com/chromedp/sysutil v1.0.0 // indirect
+ github.com/go-ole/go-ole v1.2.6 // indirect
github.com/gobwas/httphead v0.1.0 // indirect
github.com/gobwas/pool v0.2.1 // indirect
github.com/gobwas/ws v1.3.2 // indirect
github.com/josharian/intern v1.0.0 // indirect
github.com/mailru/easyjson v0.7.7 // indirect
github.com/shirou/gopsutil v3.21.11+incompatible
+ github.com/yusufpapurcu/wmi v1.2.4 // indirect
golang.org/x/net v0.21.0 // indirect
golang.org/x/sys v0.17.0 // indirect
golang.org/x/time v0.5.0 // indirect
- github.com/go-ole/go-ole v1.2.6 // indirect
- github.com/yusufpapurcu/wmi v1.2.4 // indirect
-)
\ No newline at end of file
+)
+
+require (
+	github.com/fsnotify/fsnotify v1.7.0
+	gopkg.in/ini.v1 v1.67.0
+)
diff --git a/go.sum b/go.sum
index cef914e..7fab23f 100644
--- a/go.sum
+++ b/go.sum
@@ -8,6 +8,8 @@ github.com/chromedp/chromedp v0.9.5 h1:viASzruPJOiThk7c5bueOUY91jGLJVximoEMGoH93
github.com/chromedp/chromedp v0.9.5/go.mod h1:D4I2qONslauw/C7INoCir1BJkSwBYMyZgx8X276z3+Y=
github.com/chromedp/sysutil v1.0.0 h1:+ZxhTpfpZlmchB58ih/LBHX52ky7w2VhQVKQMucy3Ic=
github.com/chromedp/sysutil v1.0.0/go.mod h1:kgWmDdq8fTzXYcKIBqIYvRRTnYb9aNS9moAV0xufSww=
+github.com/fsnotify/fsnotify v1.7.0 h1:8JEhPFa5W2WU7YfeZzPNqzMP6Lwt7L2715Ggo0nosvA=
+github.com/fsnotify/fsnotify v1.7.0/go.mod h1:40Bi/Hjc2AVfZrqy+aj+yEI+/bRxZnMJyTJwOpGvigM=
github.com/go-ole/go-ole v1.2.6 h1:/Fpf6oFPoeFik9ty7siob0G6Ke8QvQEuVcuChpwXzpY=
github.com/go-ole/go-ole v1.2.6/go.mod h1:pprOEPIfldk/42T2oK7lQ4v4JSDwmV0As9GaiUsvbm0=
github.com/gobwas/httphead v0.1.0 h1:exrUm0f4YX0L7EBwZHuCF4GDp8aJfVeBrlLQrs6NqWU=
@@ -69,3 +71,5 @@ golang.org/x/tools v0.0.0-20191119224855-298f0cb1881e/go.mod h1:b+2E5dAYhXwXZwtn
golang.org/x/tools v0.1.12/go.mod h1:hNGJHUnrk76NpqgfD5Aqm5Crs+Hm0VOH/i9J2+nxYbc=
golang.org/x/tools v0.6.0/go.mod h1:Xwgl3UAJ/d3gWutnCtw505GrjyAbvKui8lOU390QaIU=
golang.org/x/xerrors v0.0.0-20190717185122-a985d3407aa7/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
+gopkg.in/ini.v1 v1.67.0 h1:Dgnx+6+nfE+IfzjUEISNeydPJh9AXNNsWbGP9KzCsOA=
+gopkg.in/ini.v1 v1.67.0/go.mod h1:pNLf8WUiyNEtQjuu5G5vTm06TEv9tsIgeAvK8hOrP4k=
diff --git a/imageproxy.go b/imageproxy.go
index f828116..74a1cbb 100644
--- a/imageproxy.go
+++ b/imageproxy.go
@@ -1,8 +1,8 @@
package main
import (
+ "fmt"
"io"
- "log"
"net/http"
)
@@ -14,12 +14,20 @@ func handleImageProxy(w http.ResponseWriter, r *http.Request) {
return
}
- // Fetch the image from the external URL
- resp, err := http.Get(imageURL)
- if err != nil {
- log.Printf("Error fetching image: %v", err)
- http.Error(w, "Internal Server Error", http.StatusInternalServerError)
- return
+ // Try to fetch the image from Bing first
+ bingURL := fmt.Sprintf("https://tse.mm.bing.net/th?q=%s", imageURL)
+ resp, err := http.Get(bingURL)
+ if err != nil || resp.StatusCode != http.StatusOK {
+ // If fetching from Bing fails, attempt to fetch from the original image URL
+ printWarn("Error fetching image from Bing, trying original URL.")
+		if resp != nil { resp.Body.Close() } // Close the failed Bing response body before retrying
+ // Attempt to fetch the image directly
+ resp, err = http.Get(imageURL)
+ if err != nil {
+ printWarn("Error fetching image: %v", err)
+ http.Error(w, "Internal Server Error", http.StatusInternalServerError)
+ return
+ }
}
defer resp.Body.Close()
@@ -40,7 +48,7 @@ func handleImageProxy(w http.ResponseWriter, r *http.Request) {
// Write the image content to the response
if _, err := io.Copy(w, resp.Body); err != nil {
- log.Printf("Error writing image to response: %v", err)
+ printWarn("Error writing image to response: %v", err)
http.Error(w, "Internal Server Error", http.StatusInternalServerError)
}
}
diff --git a/images-bing.go b/images-bing.go
new file mode 100644
index 0000000..a9f8717
--- /dev/null
+++ b/images-bing.go
@@ -0,0 +1,107 @@
+package main
+
+import (
+ "fmt"
+ "net/http"
+ "net/url"
+ "strconv"
+ "strings"
+ "time"
+
+ "github.com/PuerkitoBio/goquery"
+)
+
+func PerformBingImageSearch(query, safe, lang string, page int) ([]ImageSearchResult, time.Duration, error) {
+ startTime := time.Now()
+
+ // Build the search URL
+ searchURL := buildBingSearchURL(query, page)
+
+ // Make the HTTP request
+ resp, err := http.Get(searchURL)
+ if err != nil {
+ return nil, 0, fmt.Errorf("making request: %v", err)
+ }
+ defer resp.Body.Close()
+
+ if resp.StatusCode != http.StatusOK {
+ return nil, 0, fmt.Errorf("unexpected status code: %d", resp.StatusCode)
+ }
+
+ // Parse the HTML document
+ doc, err := goquery.NewDocumentFromReader(resp.Body)
+ if err != nil {
+ return nil, 0, fmt.Errorf("loading HTML document: %v", err)
+ }
+
+ // Extract data using goquery
+ var results []ImageSearchResult
+ doc.Find(".imgpt").Each(func(i int, s *goquery.Selection) {
+ imgTag := s.Find("img")
+ imgSrc, exists := imgTag.Attr("src")
+ if !exists {
+ return
+ }
+
+ title, _ := imgTag.Attr("alt")
+
+ // Extract width and height if available
+ width, _ := strconv.Atoi(imgTag.AttrOr("width", "0"))
+ height, _ := strconv.Atoi(imgTag.AttrOr("height", "0"))
+
+ // Extract the original image URL from the `mediaurl` parameter in the link
+ pageLink, exists := s.Find("a.iusc").Attr("href")
+ mediaURL := ""
+ if exists {
+ if u, err := url.Parse(pageLink); err == nil {
+ if mediaURLParam := u.Query().Get("mediaurl"); mediaURLParam != "" {
+ mediaURL, _ = url.QueryUnescape(mediaURLParam)
+ }
+ }
+ }
+
+ results = append(results, ImageSearchResult{
+ Thumbnail: imgSrc,
+ Title: strings.TrimSpace(title),
+ Media: imgSrc,
+ Width: width,
+ Height: height,
+ Source: mediaURL, // Original image URL
+ ThumbProxy: imgSrc,
+ })
+ })
+
+ duration := time.Since(startTime)
+
+ // Check if the number of results is one or less
+ if len(results) <= 1 {
+ return nil, duration, fmt.Errorf("no images found")
+ }
+
+ return results, duration, nil
+}
+
+func buildBingSearchURL(query string, page int) string {
+ baseURL := "https://www.bing.com/images/search"
+ params := url.Values{}
+ params.Add("q", query)
+ params.Add("first", fmt.Sprintf("%d", (page-1)*35+1)) // Pagination, but increasing it doesn't seem to make a difference
+ params.Add("count", "35")
+ params.Add("form", "HDRSC2")
+ return baseURL + "?" + params.Encode()
+}
+
+// func main() {
+// results, duration, err := PerformBingImageSearch("kittens", "false", "en", 1)
+// if err != nil {
+// fmt.Println("Error:", err)
+// return
+// }
+
+// fmt.Printf("Search took: %v\n", duration)
+// fmt.Printf("Total results: %d\n", len(results))
+// for _, result := range results {
+// fmt.Printf("Title: %s\nThumbnail: %s\nWidth: %d\nHeight: %d\nThumbProxy: %s\nSource (Original Image URL): %s\n\n",
+// result.Title, result.Thumbnail, result.Width, result.Height, result.ThumbProxy, result.Source)
+// }
+// }
diff --git a/images.go b/images.go
old mode 100644
new mode 100755
index 84c1366..bdac1cd
--- a/images.go
+++ b/images.go
@@ -1,141 +1,149 @@
-package main
-
-import (
- "fmt"
- "html/template"
- "log"
- "net/http"
- "time"
-)
-
-var imageSearchEngines []SearchEngine
-
-func init() {
- imageSearchEngines = []SearchEngine{
- {Name: "Qwant", Func: wrapImageSearchFunc(PerformQwantImageSearch), Weight: 1},
- {Name: "Imgur", Func: wrapImageSearchFunc(PerformImgurImageSearch), Weight: 2},
- }
-}
-
-func handleImageSearch(w http.ResponseWriter, query, safe, lang string, page int) {
- startTime := time.Now()
-
- cacheKey := CacheKey{Query: query, Page: page, Safe: safe == "true", Lang: lang, Type: "image"}
- combinedResults := getImageResultsFromCacheOrFetch(cacheKey, query, safe, lang, page)
-
- elapsedTime := time.Since(startTime)
- tmpl, err := template.New("images.html").Funcs(funcs).ParseFiles("templates/images.html")
- if err != nil {
- log.Printf("Error parsing template: %v", err)
- http.Error(w, "Internal Server Error", http.StatusInternalServerError)
- return
- }
-
- data := struct {
- Results []ImageSearchResult
- Query string
- Page int
- Fetched string
- LanguageOptions []LanguageOption
- CurrentLang string
- HasPrevPage bool
- HasNextPage bool
- }{
- Results: combinedResults,
- Query: query,
- Page: page,
- Fetched: fmt.Sprintf("%.2f seconds", elapsedTime.Seconds()),
- LanguageOptions: languageOptions,
- CurrentLang: lang,
- HasPrevPage: page > 1,
- HasNextPage: len(combinedResults) >= 50,
- }
-
- err = tmpl.Execute(w, data)
- if err != nil {
- log.Printf("Error executing template: %v", err)
- http.Error(w, "Internal Server Error", http.StatusInternalServerError)
- }
-}
-
-func getImageResultsFromCacheOrFetch(cacheKey CacheKey, query, safe, lang string, page int) []ImageSearchResult {
- cacheChan := make(chan []SearchResult)
- var combinedResults []ImageSearchResult
-
- go func() {
- results, exists := resultsCache.Get(cacheKey)
- if exists {
- log.Println("Cache hit")
- cacheChan <- results
- } else {
- log.Println("Cache miss")
- cacheChan <- nil
- }
- }()
-
- select {
- case results := <-cacheChan:
- if results == nil {
- combinedResults = fetchImageResults(query, safe, lang, page)
- if len(combinedResults) > 0 {
- resultsCache.Set(cacheKey, convertToSearchResults(combinedResults))
- }
- } else {
- _, _, imageResults := convertToSpecificResults(results)
- combinedResults = imageResults
- }
- case <-time.After(2 * time.Second):
- log.Println("Cache check timeout")
- combinedResults = fetchImageResults(query, safe, lang, page)
- if len(combinedResults) > 0 {
- resultsCache.Set(cacheKey, convertToSearchResults(combinedResults))
- }
- }
-
- return combinedResults
-}
-
-func fetchImageResults(query, safe, lang string, page int) []ImageSearchResult {
- var results []ImageSearchResult
-
- for _, engine := range imageSearchEngines {
- log.Printf("Using image search engine: %s", engine.Name)
-
- searchResults, duration, err := engine.Func(query, safe, lang, page)
- updateEngineMetrics(&engine, duration, err == nil)
- if err != nil {
- log.Printf("Error performing image search with %s: %v", engine.Name, err)
- continue
- }
-
- for _, result := range searchResults {
- results = append(results, result.(ImageSearchResult))
- }
-
- // If results are found, break out of the loop
- if len(results) > 0 {
- break
- }
- }
-
- // If no results found after trying all engines
- if len(results) == 0 {
- log.Printf("No image results found for query: %s", query)
- }
-
- return results
-}
-
-func wrapImageSearchFunc(f func(string, string, string, int) ([]ImageSearchResult, time.Duration, error)) func(string, string, string, int) ([]SearchResult, time.Duration, error) {
- return func(query, safe, lang string, page int) ([]SearchResult, time.Duration, error) {
- imageResults, duration, err := f(query, safe, lang, page)
- if err != nil {
- return nil, duration, err
- }
- searchResults := make([]SearchResult, len(imageResults))
- for i, result := range imageResults {
- searchResults[i] = result
- }
- return searchResults, duration, nil
- }
-}
+package main
+
+import (
+ "fmt"
+ "html/template"
+ "log"
+ "net/http"
+ "time"
+)
+
+var imageSearchEngines []SearchEngine
+
+func init() {
+ imageSearchEngines = []SearchEngine{
+ {Name: "Qwant", Func: wrapImageSearchFunc(PerformQwantImageSearch), Weight: 1},
+		{Name: "Bing", Func: wrapImageSearchFunc(PerformBingImageSearch), Weight: 2}, // Bing sometimes returns few images, which breaks dynamic page loading
+ {Name: "Imgur", Func: wrapImageSearchFunc(PerformImgurImageSearch), Weight: 3},
+ }
+}
+
+func handleImageSearch(w http.ResponseWriter, settings UserSettings, query string, page int) {
+ startTime := time.Now()
+
+ cacheKey := CacheKey{Query: query, Page: page, Safe: settings.SafeSearch == "true", Lang: settings.Language, Type: "image"}
+ combinedResults := getImageResultsFromCacheOrFetch(cacheKey, query, settings.SafeSearch, settings.Language, page)
+
+ elapsedTime := time.Since(startTime)
+ tmpl, err := template.New("images.html").Funcs(funcs).ParseFiles("templates/images.html")
+ if err != nil {
+ log.Printf("Error parsing template: %v", err)
+ http.Error(w, "Internal Server Error", http.StatusInternalServerError)
+ return
+ }
+
+ data := struct {
+ Results []ImageSearchResult
+ Query string
+ Page int
+ Fetched string
+ HasPrevPage bool
+ HasNextPage bool
+ NoResults bool
+ LanguageOptions []LanguageOption
+ CurrentLang string
+ Theme string
+ Safe string
+ }{
+ Results: combinedResults,
+ Query: query,
+ Page: page,
+ Fetched: fmt.Sprintf("%.2f seconds", elapsedTime.Seconds()),
+ HasPrevPage: page > 1,
+ HasNextPage: len(combinedResults) >= 50,
+ NoResults: len(combinedResults) == 0,
+ LanguageOptions: languageOptions,
+ CurrentLang: settings.Language,
+ Theme: settings.Theme,
+ Safe: settings.SafeSearch,
+ }
+
+ err = tmpl.Execute(w, data)
+ if err != nil {
+ printErr("Error executing template: %v", err)
+ http.Error(w, "Internal Server Error", http.StatusInternalServerError)
+ }
+}
+
+func getImageResultsFromCacheOrFetch(cacheKey CacheKey, query, safe, lang string, page int) []ImageSearchResult {
+ cacheChan := make(chan []SearchResult)
+ var combinedResults []ImageSearchResult
+
+ go func() {
+ results, exists := resultsCache.Get(cacheKey)
+ if exists {
+ printInfo("Cache hit")
+ cacheChan <- results
+ } else {
+ printInfo("Cache miss")
+ cacheChan <- nil
+ }
+ }()
+
+ select {
+ case results := <-cacheChan:
+ if results == nil {
+ combinedResults = fetchImageResults(query, safe, lang, page)
+ if len(combinedResults) > 0 {
+ resultsCache.Set(cacheKey, convertToSearchResults(combinedResults))
+ }
+ } else {
+ _, _, imageResults := convertToSpecificResults(results)
+ combinedResults = imageResults
+ }
+ case <-time.After(2 * time.Second):
+ printInfo("Cache check timeout")
+ combinedResults = fetchImageResults(query, safe, lang, page)
+ if len(combinedResults) > 0 {
+ resultsCache.Set(cacheKey, convertToSearchResults(combinedResults))
+ }
+ }
+
+ return combinedResults
+}
+
+func fetchImageResults(query, safe, lang string, page int) []ImageSearchResult {
+ var results []ImageSearchResult
+
+ for _, engine := range imageSearchEngines {
+ printInfo("Using image search engine: %s", engine.Name)
+
+ searchResults, duration, err := engine.Func(query, safe, lang, page)
+ updateEngineMetrics(&engine, duration, err == nil)
+ if err != nil {
+ printWarn("Error performing image search with %s: %v", engine.Name, err)
+ continue
+ }
+
+ for _, result := range searchResults {
+ results = append(results, result.(ImageSearchResult))
+ }
+
+ // If results are found, break out of the loop
+ if len(results) > 0 {
+ break
+ }
+ }
+
+ // If no results found after trying all engines
+ if len(results) == 0 {
+ printWarn("No image results found for query: %s, trying other nodes", query)
+ results = tryOtherNodesForImageSearch(query, safe, lang, page, []string{hostID})
+ }
+
+ return results
+}
+
+func wrapImageSearchFunc(f func(string, string, string, int) ([]ImageSearchResult, time.Duration, error)) func(string, string, string, int) ([]SearchResult, time.Duration, error) {
+ return func(query, safe, lang string, page int) ([]SearchResult, time.Duration, error) {
+ imageResults, duration, err := f(query, safe, lang, page)
+ if err != nil {
+ return nil, duration, err
+ }
+ searchResults := make([]SearchResult, len(imageResults))
+ for i, result := range imageResults {
+ searchResults[i] = result
+ }
+ return searchResults, duration, nil
+ }
+}
diff --git a/init.go b/init.go
index b3129a4..8074dd5 100644
--- a/init.go
+++ b/init.go
@@ -1,116 +1,66 @@
package main
import (
- "bufio"
- "encoding/json"
- "fmt"
- "log"
- "os"
- "strconv"
+ "time"
)
-// Configuration structure
type Config struct {
- Port int
- OpenSearch OpenSearchConfig
+ Port int
+ AuthCode string
+ PeerID string
+ Peers []string
+ Domain string
+ NodesEnabled bool
+ CrawlerEnabled bool
+ WebsiteEnabled bool
+ LogLevel int
}
-type OpenSearchConfig struct {
- Domain string
-}
-
-// Default configuration values
var defaultConfig = Config{
- Port: 5000,
- OpenSearch: OpenSearchConfig{
- Domain: "localhost",
- },
+ Port: 5000,
+ Domain: "localhost",
+ Peers: []string{},
+ AuthCode: generateStrongRandomString(64),
+ NodesEnabled: true,
+ CrawlerEnabled: true,
+ WebsiteEnabled: true,
+ LogLevel: 1,
}
-const configFilePath = "config.json"
+const configFilePath = "config.ini"
+
+var config Config
func main() {
- // Run the initialization process
err := initConfig()
if err != nil {
- fmt.Println("Error during initialization:", err)
+		printErr("Error during initialization: %v", err)
return
}
- // Start the main application
+ loadNodeConfig()
+ go startFileWatcher()
+ go checkMasterHeartbeat()
+
+ if config.AuthCode == "" {
+ config.AuthCode = generateStrongRandomString(64)
+		printInfo("Generated connection code: %s", config.AuthCode)
+ saveConfig(config)
+ }
+
+ // Generate Host ID
+ hostID, nodeErr := generateHostID()
+ if nodeErr != nil {
+ printErr("Failed to generate host ID: %v", nodeErr)
+ }
+ config.PeerID = hostID
+
+ if len(config.Peers) > 0 {
+ time.Sleep(2 * time.Second) // Give some time for connections to establish
+ startElection()
+ }
+
+ go startNodeClient()
+
runServer()
}
-
-func initConfig() error {
- if _, err := os.Stat(configFilePath); os.IsNotExist(err) {
- return createConfig()
- }
-
- fmt.Println("Configuration file already exists.")
- return nil
-}
-
-func createConfig() error {
- reader := bufio.NewReader(os.Stdin)
-
- fmt.Println("Configuration file not found.")
- fmt.Print("Do you want to use default values? (yes/no): ")
- useDefaults, _ := reader.ReadString('\n')
-
- config := defaultConfig
- if useDefaults != "yes\n" {
- fmt.Print("Enter port (default 5000): ")
- portStr, _ := reader.ReadString('\n')
- if portStr != "\n" {
- port, err := strconv.Atoi(portStr[:len(portStr)-1])
- if err != nil {
- return err
- }
- config.Port = port
- }
-
- fmt.Print("Enter your domain address (e.g., domain.com): ")
- domain, _ := reader.ReadString('\n')
- if domain != "\n" {
- config.OpenSearch.Domain = domain[:len(domain)-1]
- }
- }
-
- saveConfig(config)
- return nil
-}
-
-func saveConfig(config Config) {
- file, err := os.Create(configFilePath)
- if err != nil {
- fmt.Println("Error creating config file:", err)
- return
- }
- defer file.Close()
-
- configData, err := json.MarshalIndent(config, "", " ")
- if err != nil {
- fmt.Println("Error marshalling config data:", err)
- return
- }
-
- _, err = file.Write(configData)
- if err != nil {
- fmt.Println("Error writing to config file:", err)
- }
-}
-
-func loadConfig() Config {
- configFile, err := os.Open(configFilePath)
- if err != nil {
- log.Fatalf("Error opening config file: %v", err)
- }
- defer configFile.Close()
-
- var config Config
- if err := json.NewDecoder(configFile).Decode(&config); err != nil {
- log.Fatalf("Error decoding config file: %v", err)
- }
-
- return config
-}
diff --git a/main.go b/main.go
old mode 100644
new mode 100755
index a463ba6..65c272b
--- a/main.go
+++ b/main.go
@@ -1,143 +1,188 @@
-package main
-
-import (
- "fmt"
- "log"
- "net/http"
- "strconv"
-)
-
-// LanguageOption represents a language option for search
-type LanguageOption struct {
- Code string
- Name string
-}
-
-var languageOptions = []LanguageOption{
- {Code: "", Name: "Any Language"},
- {Code: "lang_en", Name: "English"},
- {Code: "lang_af", Name: "Afrikaans"},
- {Code: "lang_ar", Name: "العربية (Arabic)"},
- {Code: "lang_hy", Name: "Հայերեն (Armenian)"},
- {Code: "lang_be", Name: "Беларуская (Belarusian)"},
- {Code: "lang_bg", Name: "български (Bulgarian)"},
- {Code: "lang_ca", Name: "Català (Catalan)"},
- {Code: "lang_zh-CN", Name: "中文 (简体) (Chinese Simplified)"},
- {Code: "lang_zh-TW", Name: "中文 (繁體) (Chinese Traditional)"},
- {Code: "lang_hr", Name: "Hrvatski (Croatian)"},
- {Code: "lang_cs", Name: "Čeština (Czech)"},
- {Code: "lang_da", Name: "Dansk (Danish)"},
- {Code: "lang_nl", Name: "Nederlands (Dutch)"},
- {Code: "lang_eo", Name: "Esperanto"},
- {Code: "lang_et", Name: "Eesti (Estonian)"},
- {Code: "lang_tl", Name: "Filipino (Tagalog)"},
- {Code: "lang_fi", Name: "Suomi (Finnish)"},
- {Code: "lang_fr", Name: "Français (French)"},
- {Code: "lang_de", Name: "Deutsch (German)"},
- {Code: "lang_el", Name: "Ελληνικά (Greek)"},
- {Code: "lang_iw", Name: "עברית (Hebrew)"},
- {Code: "lang_hi", Name: "हिन्दी (Hindi)"},
- {Code: "lang_hu", Name: "magyar (Hungarian)"},
- {Code: "lang_is", Name: "íslenska (Icelandic)"},
- {Code: "lang_id", Name: "Bahasa Indonesia (Indonesian)"},
- {Code: "lang_it", Name: "italiano (Italian)"},
- {Code: "lang_ja", Name: "日本語 (Japanese)"},
- {Code: "lang_ko", Name: "한국어 (Korean)"},
- {Code: "lang_lv", Name: "latviešu (Latvian)"},
- {Code: "lang_lt", Name: "lietuvių (Lithuanian)"},
- {Code: "lang_no", Name: "norsk (Norwegian)"},
- {Code: "lang_fa", Name: "فارسی (Persian)"},
- {Code: "lang_pl", Name: "polski (Polish)"},
- {Code: "lang_pt", Name: "português (Portuguese)"},
- {Code: "lang_ro", Name: "română (Romanian)"},
- {Code: "lang_ru", Name: "русский (Russian)"},
- {Code: "lang_sr", Name: "српски (Serbian)"},
- {Code: "lang_sk", Name: "slovenčina (Slovak)"},
- {Code: "lang_sl", Name: "slovenščina (Slovenian)"},
- {Code: "lang_es", Name: "español (Spanish)"},
- {Code: "lang_sw", Name: "Kiswahili (Swahili)"},
- {Code: "lang_sv", Name: "svenska (Swedish)"},
- {Code: "lang_th", Name: "ไทย (Thai)"},
- {Code: "lang_tr", Name: "Türkçe (Turkish)"},
- {Code: "lang_uk", Name: "українська (Ukrainian)"},
- {Code: "lang_vi", Name: "Tiếng Việt (Vietnamese)"},
-}
-
-func handleSearch(w http.ResponseWriter, r *http.Request) {
- query, safe, lang, searchType, page := parseSearchParams(r)
-
- if query == "" {
- http.ServeFile(w, r, "templates/search.html")
- return
- }
-
- switch searchType {
- case "image":
- handleImageSearch(w, query, safe, lang, page)
- case "video":
- handleVideoSearch(w, query, safe, lang, page)
- case "map":
- handleMapSearch(w, query, safe)
- case "forum":
- handleForumsSearch(w, query, safe, lang, page)
- case "file":
- handleFileSearch(w, query, safe, lang, page)
- case "text":
- fallthrough
- default:
- HandleTextSearch(w, query, safe, lang, page)
- }
-}
-
-func parseSearchParams(r *http.Request) (query, safe, lang, searchType string, page int) {
- if r.Method == "GET" {
- query = r.URL.Query().Get("q")
- safe = r.URL.Query().Get("safe")
- lang = r.URL.Query().Get("lang")
- searchType = r.URL.Query().Get("t")
- pageStr := r.URL.Query().Get("p")
- page = parsePageParameter(pageStr)
- } else if r.Method == "POST" {
- query = r.FormValue("q")
- safe = r.FormValue("safe")
- lang = r.FormValue("lang")
- searchType = r.FormValue("t")
- pageStr := r.FormValue("p")
- page = parsePageParameter(pageStr)
- }
-
- if searchType == "" {
- searchType = "text"
- }
-
- return
-}
-
-func parsePageParameter(pageStr string) int {
- page, err := strconv.Atoi(pageStr)
- if err != nil || page < 1 {
- page = 1
- }
- return page
-}
-
-func runServer() {
- http.Handle("/static/", http.StripPrefix("/static/", http.FileServer(http.Dir("static"))))
- http.HandleFunc("/", handleSearch)
- http.HandleFunc("/search", handleSearch)
- http.HandleFunc("/img_proxy", handleImageProxy)
- http.HandleFunc("/settings", func(w http.ResponseWriter, r *http.Request) {
- http.ServeFile(w, r, "templates/settings.html")
- })
- http.HandleFunc("/opensearch.xml", func(w http.ResponseWriter, r *http.Request) {
- w.Header().Set("Content-Type", "application/opensearchdescription+xml")
- http.ServeFile(w, r, "static/opensearch.xml")
- })
- initializeTorrentSites()
-
- config := loadConfig()
- generateOpenSearchXML(config)
-
- fmt.Printf("Server is listening on http://localhost:%d\n", config.Port)
- log.Fatal(http.ListenAndServe(fmt.Sprintf(":%d", config.Port), nil))
-}
+package main
+
+import (
+ "fmt"
+ "html/template"
+ "log"
+ "net/http"
+ "strconv"
+)
+
+// LanguageOption represents a language option for search
+type LanguageOption struct {
+ Code string
+ Name string
+}
+
+var settings UserSettings
+
+var languageOptions = []LanguageOption{
+ {Code: "", Name: "Any Language"},
+ {Code: "lang_en", Name: "English"},
+ {Code: "lang_af", Name: "Afrikaans"},
+ {Code: "lang_ar", Name: "العربية (Arabic)"},
+ {Code: "lang_hy", Name: "Հայերեն (Armenian)"},
+ {Code: "lang_be", Name: "Беларуская (Belarusian)"},
+ {Code: "lang_bg", Name: "български (Bulgarian)"},
+ {Code: "lang_ca", Name: "Català (Catalan)"},
+ {Code: "lang_zh-CN", Name: "中文 (简体) (Chinese Simplified)"},
+ {Code: "lang_zh-TW", Name: "中文 (繁體) (Chinese Traditional)"},
+ {Code: "lang_hr", Name: "Hrvatski (Croatian)"},
+ {Code: "lang_cs", Name: "Čeština (Czech)"},
+ {Code: "lang_da", Name: "Dansk (Danish)"},
+ {Code: "lang_nl", Name: "Nederlands (Dutch)"},
+ {Code: "lang_eo", Name: "Esperanto"},
+ {Code: "lang_et", Name: "Eesti (Estonian)"},
+ {Code: "lang_tl", Name: "Filipino (Tagalog)"},
+ {Code: "lang_fi", Name: "Suomi (Finnish)"},
+ {Code: "lang_fr", Name: "Français (French)"},
+ {Code: "lang_de", Name: "Deutsch (German)"},
+ {Code: "lang_el", Name: "Ελληνικά (Greek)"},
+ {Code: "lang_iw", Name: "עברית (Hebrew)"},
+ {Code: "lang_hi", Name: "हिन्दी (Hindi)"},
+ {Code: "lang_hu", Name: "magyar (Hungarian)"},
+ {Code: "lang_is", Name: "íslenska (Icelandic)"},
+ {Code: "lang_id", Name: "Bahasa Indonesia (Indonesian)"},
+ {Code: "lang_it", Name: "italiano (Italian)"},
+ {Code: "lang_ja", Name: "日本語 (Japanese)"},
+ {Code: "lang_ko", Name: "한국어 (Korean)"},
+ {Code: "lang_lv", Name: "latviešu (Latvian)"},
+ {Code: "lang_lt", Name: "lietuvių (Lithuanian)"},
+ {Code: "lang_no", Name: "norsk (Norwegian)"},
+ {Code: "lang_fa", Name: "فارسی (Persian)"},
+ {Code: "lang_pl", Name: "polski (Polish)"},
+ {Code: "lang_pt", Name: "português (Portuguese)"},
+ {Code: "lang_ro", Name: "română (Romanian)"},
+ {Code: "lang_ru", Name: "русский (Russian)"},
+ {Code: "lang_sr", Name: "српски (Serbian)"},
+ {Code: "lang_sk", Name: "slovenčina (Slovak)"},
+ {Code: "lang_sl", Name: "slovenščina (Slovenian)"},
+ {Code: "lang_es", Name: "español (Spanish)"},
+ {Code: "lang_sw", Name: "Kiswahili (Swahili)"},
+ {Code: "lang_sv", Name: "svenska (Swedish)"},
+ {Code: "lang_th", Name: "ไทย (Thai)"},
+ {Code: "lang_tr", Name: "Türkçe (Turkish)"},
+ {Code: "lang_uk", Name: "українська (Ukrainian)"},
+ {Code: "lang_vi", Name: "Tiếng Việt (Vietnamese)"},
+}
+
+func handleSearch(w http.ResponseWriter, r *http.Request) {
+ query, safe, lang, searchType, page := parseSearchParams(r)
+
+ // Load user settings
+ settings = loadUserSettings(r)
+
+ // Update the theme, safe search, and language based on query parameters or use existing settings
+ theme := r.URL.Query().Get("theme")
+ if theme != "" {
+ settings.Theme = theme
+ saveUserSettings(w, settings)
+ } else if settings.Theme == "" {
+ settings.Theme = "dark" // Default theme
+ }
+
+ if safe != "" {
+ settings.SafeSearch = safe
+ saveUserSettings(w, settings)
+ }
+
+ if lang != "" {
+ settings.Language = lang
+ saveUserSettings(w, settings)
+ }
+
+
+	// Prepare template data; used below to render the search page when no query is given
+ data := struct {
+ LanguageOptions []LanguageOption
+ CurrentLang string
+ Theme string
+ Safe string
+ }{
+ LanguageOptions: languageOptions,
+ CurrentLang: settings.Language,
+ Theme: settings.Theme,
+ Safe: settings.SafeSearch,
+ }
+ if query == "" {
+ tmpl := template.Must(template.ParseFiles("templates/search.html"))
+ tmpl.Execute(w, data)
+ return
+ }
+
+ // Handle search based on the type
+ switch searchType {
+ case "image":
+ handleImageSearch(w, settings, query, page)
+ case "video":
+ handleVideoSearch(w, settings, query, page)
+ case "map":
+ handleMapSearch(w, settings, query)
+ case "forum":
+ handleForumsSearch(w, settings, query, page)
+ case "file":
+ handleFileSearch(w, settings, query, page)
+ case "text":
+ fallthrough
+ default:
+ HandleTextSearch(w, settings, query, page)
+ }
+}
+
+func parseSearchParams(r *http.Request) (query, safe, lang, searchType string, page int) {
+ if r.Method == "GET" {
+ query = r.URL.Query().Get("q")
+ safe = r.URL.Query().Get("safe")
+ lang = r.URL.Query().Get("lang")
+ searchType = r.URL.Query().Get("t")
+ pageStr := r.URL.Query().Get("p")
+ page = parsePageParameter(pageStr)
+ } else if r.Method == "POST" {
+ query = r.FormValue("q")
+ safe = r.FormValue("safe")
+ lang = r.FormValue("lang")
+ searchType = r.FormValue("t")
+ pageStr := r.FormValue("p")
+ page = parsePageParameter(pageStr)
+ }
+
+ if searchType == "" {
+ searchType = "text"
+ }
+
+ return
+}
+
+func parsePageParameter(pageStr string) int {
+ page, err := strconv.Atoi(pageStr)
+ if err != nil || page < 1 {
+ page = 1
+ }
+ return page
+}
+
+func runServer() {
+ http.Handle("/static/", http.StripPrefix("/static/", http.FileServer(http.Dir("static"))))
+ http.HandleFunc("/", handleSearch)
+ http.HandleFunc("/search", handleSearch)
+ http.HandleFunc("/img_proxy", handleImageProxy)
+ http.HandleFunc("/node", handleNodeRequest)
+ http.HandleFunc("/settings", handleSettings)
+ http.HandleFunc("/save-settings", handleSaveSettings)
+ http.HandleFunc("/opensearch.xml", func(w http.ResponseWriter, r *http.Request) {
+ w.Header().Set("Content-Type", "application/opensearchdescription+xml")
+ http.ServeFile(w, r, "static/opensearch.xml")
+ })
+ initializeTorrentSites()
+
+ config := loadConfig()
+ generateOpenSearchXML(config)
+
+	// Start automatic update checker before blocking on the listener
+	go checkForUpdates()
+
+	printMessage("Server is listening on http://localhost:%d", config.Port)
+	log.Fatal(http.ListenAndServe(fmt.Sprintf(":%d", config.Port), nil))
+}
diff --git a/map.go b/map.go
old mode 100644
new mode 100755
index 7ad21ff..df3c7e8
--- a/map.go
+++ b/map.go
@@ -1,72 +1,73 @@
-package main
-
-import (
- "encoding/json"
- "fmt"
- "html/template"
- "log"
- "net/http"
- "net/url"
-)
-
-type NominatimResponse struct {
- Lat string `json:"lat"`
- Lon string `json:"lon"`
-}
-
-func geocodeQuery(query string) (latitude, longitude string, found bool, err error) {
- // URL encode the query
- query = url.QueryEscape(query)
-
- // Construct the request URL
- urlString := fmt.Sprintf("https://nominatim.openstreetmap.org/search?format=json&q=%s", query)
-
- // Make the HTTP GET request
- resp, err := http.Get(urlString)
- if err != nil {
- return "", "", false, err
- }
- defer resp.Body.Close()
-
- // Read the response
- var result []NominatimResponse
- if err := json.NewDecoder(resp.Body).Decode(&result); err != nil {
- return "", "", false, err
- }
-
- // Check if there are any results
- if len(result) > 0 {
- latitude = result[0].Lat
- longitude = result[0].Lon
- return latitude, longitude, true, nil
- }
-
- return "", "", false, nil
-}
-
-func handleMapSearch(w http.ResponseWriter, query string, lang string) {
- // Geocode the query to get coordinates
- latitude, longitude, found, err := geocodeQuery(query)
- if err != nil {
- log.Printf("Error geocoding query: %s, error: %v", query, err)
- http.Error(w, "Failed to find location", http.StatusInternalServerError)
- return
- }
-
- // Use a template to serve the map page
- data := map[string]interface{}{
- "Query": query,
- "Latitude": latitude,
- "Longitude": longitude,
- "Found": found,
- }
-
- tmpl, err := template.ParseFiles("templates/map.html")
- if err != nil {
- log.Printf("Error loading map template: %v", err)
- http.Error(w, "Internal Server Error", http.StatusInternalServerError)
- return
- }
-
- tmpl.Execute(w, data)
-}
+package main
+
+import (
+ "encoding/json"
+ "fmt"
+ "html/template"
+ "net/http"
+ "net/url"
+)
+
+type NominatimResponse struct {
+ Lat string `json:"lat"`
+ Lon string `json:"lon"`
+}
+
+func geocodeQuery(query string) (latitude, longitude string, found bool, err error) {
+ // URL encode the query
+ query = url.QueryEscape(query)
+
+ // Construct the request URL
+ urlString := fmt.Sprintf("https://nominatim.openstreetmap.org/search?format=json&q=%s", query)
+
+ // Make the HTTP GET request
+ resp, err := http.Get(urlString)
+ if err != nil {
+ return "", "", false, err
+ }
+ defer resp.Body.Close()
+
+ // Read the response
+ var result []NominatimResponse
+ if err := json.NewDecoder(resp.Body).Decode(&result); err != nil {
+ return "", "", false, err
+ }
+
+ // Check if there are any results
+ if len(result) > 0 {
+ latitude = result[0].Lat
+ longitude = result[0].Lon
+ return latitude, longitude, true, nil
+ }
+
+ return "", "", false, nil
+}
+
+func handleMapSearch(w http.ResponseWriter, settings UserSettings, query string) {
+ // Geocode the query to get coordinates
+ latitude, longitude, found, err := geocodeQuery(query)
+ if err != nil {
+ printDebug("Error geocoding query: %s, error: %v", query, err)
+ http.Error(w, "Failed to find location", http.StatusInternalServerError)
+ return
+ }
+
+ // Use a template to serve the map page
+ data := map[string]interface{}{
+ "Query": query,
+ "Latitude": latitude,
+ "Longitude": longitude,
+ "Found": found,
+ "Theme": settings.Theme,
+ "Safe": settings.SafeSearch,
+ }
+
+ tmpl, err := template.ParseFiles("templates/map.html")
+ if err != nil {
+ printErr("Error loading map template: %v", err)
+ http.Error(w, "Internal Server Error", http.StatusInternalServerError)
+ return
+ }
+
+ tmpl.Execute(w, data)
+}
diff --git a/node-handle-search.go b/node-handle-search.go
new file mode 100755
index 0000000..8591b65
--- /dev/null
+++ b/node-handle-search.go
@@ -0,0 +1,219 @@
+package main
+
+import (
+ "encoding/json"
+ "log"
+)
+
+func handleSearchTextMessage(msg Message) {
+ var searchParams struct {
+ Query string `json:"query"`
+ Safe string `json:"safe"`
+ Lang string `json:"lang"`
+ Page int `json:"page"`
+ ResponseAddr string `json:"responseAddr"`
+ }
+ err := json.Unmarshal([]byte(msg.Content), &searchParams)
+ if err != nil {
+ printWarn("Error parsing search parameters: %v", err)
+ return
+ }
+
+ printDebug("Received search-text request. ResponseAddr: %s", searchParams.ResponseAddr)
+
+ results := fetchTextResults(searchParams.Query, searchParams.Safe, searchParams.Lang, searchParams.Page)
+ resultsJSON, err := json.Marshal(results)
+ if err != nil {
+ printWarn("Error marshalling search results: %v", err)
+ return
+ }
+
+ responseMsg := Message{
+ ID: hostID,
+ Type: "text-results",
+ Content: string(resultsJSON),
+ }
+
+ // Log the address to be used for sending the response
+ printDebug("Sending text search results to %s", searchParams.ResponseAddr)
+
+ if searchParams.ResponseAddr == "" {
+ printErr("Error: Response address is empty")
+ return
+ }
+
+ err = sendMessage(searchParams.ResponseAddr, responseMsg)
+ if err != nil {
+ printWarn("Error sending text search results to %s: %v", searchParams.ResponseAddr, err)
+ }
+}
+
+func handleSearchImageMessage(msg Message) {
+ var searchParams struct {
+ Query string `json:"query"`
+ Safe string `json:"safe"`
+ Lang string `json:"lang"`
+ Page int `json:"page"`
+ ResponseAddr string `json:"responseAddr"`
+ }
+ err := json.Unmarshal([]byte(msg.Content), &searchParams)
+ if err != nil {
+ log.Printf("Error parsing search parameters: %v", err)
+ return
+ }
+
+ log.Printf("Received search-image request. ResponseAddr: %s", searchParams.ResponseAddr)
+
+ results := fetchImageResults(searchParams.Query, searchParams.Safe, searchParams.Lang, searchParams.Page)
+ resultsJSON, err := json.Marshal(results)
+ if err != nil {
+ log.Printf("Error marshalling search results: %v", err)
+ return
+ }
+
+ responseMsg := Message{
+ ID: hostID,
+ Type: "image-results",
+ Content: string(resultsJSON),
+ }
+
+ // Log the address to be used for sending the response
+ log.Printf("Sending image search results to %s", searchParams.ResponseAddr)
+
+ if searchParams.ResponseAddr == "" {
+ log.Printf("Error: Response address is empty")
+ return
+ }
+
+ err = sendMessage(searchParams.ResponseAddr, responseMsg)
+ if err != nil {
+ log.Printf("Error sending image search results to %s: %v", searchParams.ResponseAddr, err)
+ }
+}
+
+func handleSearchVideoMessage(msg Message) {
+ var searchParams struct {
+ Query string `json:"query"`
+ Safe string `json:"safe"`
+ Lang string `json:"lang"`
+ Page int `json:"page"`
+ ResponseAddr string `json:"responseAddr"`
+ }
+ err := json.Unmarshal([]byte(msg.Content), &searchParams)
+ if err != nil {
+ log.Printf("Error parsing search parameters: %v", err)
+ return
+ }
+
+ log.Printf("Received search-video request. ResponseAddr: %s", searchParams.ResponseAddr)
+
+ results := fetchVideoResults(searchParams.Query, searchParams.Safe, searchParams.Lang, searchParams.Page)
+ resultsJSON, err := json.Marshal(results)
+ if err != nil {
+ log.Printf("Error marshalling search results: %v", err)
+ return
+ }
+
+ responseMsg := Message{
+ ID: hostID,
+ Type: "video-results",
+ Content: string(resultsJSON),
+ }
+
+ log.Printf("Sending video search results to %s", searchParams.ResponseAddr)
+
+ if searchParams.ResponseAddr == "" {
+ log.Printf("Error: Response address is empty")
+ return
+ }
+
+ err = sendMessage(searchParams.ResponseAddr, responseMsg)
+ if err != nil {
+ log.Printf("Error sending video search results to %s: %v", searchParams.ResponseAddr, err)
+ }
+}
+
+func handleSearchFileMessage(msg Message) {
+ var searchParams struct {
+ Query string `json:"query"`
+ Safe string `json:"safe"`
+ Lang string `json:"lang"`
+ Page int `json:"page"`
+ ResponseAddr string `json:"responseAddr"`
+ }
+ err := json.Unmarshal([]byte(msg.Content), &searchParams)
+ if err != nil {
+ log.Printf("Error parsing search parameters: %v", err)
+ return
+ }
+
+ log.Printf("Received search-file request. ResponseAddr: %s", searchParams.ResponseAddr)
+
+ results := fetchFileResults(searchParams.Query, searchParams.Safe, searchParams.Lang, searchParams.Page)
+ resultsJSON, err := json.Marshal(results)
+ if err != nil {
+ log.Printf("Error marshalling search results: %v", err)
+ return
+ }
+
+ responseMsg := Message{
+ ID: hostID,
+ Type: "file-results",
+ Content: string(resultsJSON),
+ }
+
+ log.Printf("Sending file search results to %s", searchParams.ResponseAddr)
+
+ if searchParams.ResponseAddr == "" {
+ log.Printf("Error: Response address is empty")
+ return
+ }
+
+ err = sendMessage(searchParams.ResponseAddr, responseMsg)
+ if err != nil {
+ log.Printf("Error sending file search results to %s: %v", searchParams.ResponseAddr, err)
+ }
+}
+
+func handleSearchForumMessage(msg Message) {
+ var searchParams struct {
+ Query string `json:"query"`
+ Safe string `json:"safe"`
+ Lang string `json:"lang"`
+ Page int `json:"page"`
+ ResponseAddr string `json:"responseAddr"`
+ }
+ err := json.Unmarshal([]byte(msg.Content), &searchParams)
+ if err != nil {
+ log.Printf("Error parsing search parameters: %v", err)
+ return
+ }
+
+ log.Printf("Received search-forum request. ResponseAddr: %s", searchParams.ResponseAddr)
+
+ results := fetchForumResults(searchParams.Query, searchParams.Safe, searchParams.Lang, searchParams.Page)
+ resultsJSON, err := json.Marshal(results)
+ if err != nil {
+ log.Printf("Error marshalling search results: %v", err)
+ return
+ }
+
+ responseMsg := Message{
+ ID: hostID,
+ Type: "forum-results",
+ Content: string(resultsJSON),
+ }
+
+ // Log the address to be used for sending the response
+ log.Printf("Sending forum search results to %s", searchParams.ResponseAddr)
+
+ if searchParams.ResponseAddr == "" {
+ log.Printf("Error: Response address is empty")
+ return
+ }
+
+ err = sendMessage(searchParams.ResponseAddr, responseMsg)
+ if err != nil {
+ log.Printf("Error sending forum search results to %s: %v", searchParams.ResponseAddr, err)
+ }
+}
diff --git a/node-master.go b/node-master.go
new file mode 100644
index 0000000..133f72e
--- /dev/null
+++ b/node-master.go
@@ -0,0 +1,91 @@
+package main
+
+import (
+ "log"
+ "sync"
+ "time"
+)
+
+var (
+ isMaster bool
+ masterNode string
+ masterNodeMux sync.RWMutex
+)
+
+const (
+ heartbeatInterval = 5 * time.Second
+ heartbeatTimeout = 15 * time.Second
+ electionTimeout = 10 * time.Second
+)
+
+func sendHeartbeats() {
+ for {
+ if !isMaster {
+ return
+ }
+ for _, node := range peers {
+ msg := Message{
+ ID: hostID,
+ Type: "heartbeat",
+ Content: authCode,
+ }
+ err := sendMessage(node, msg)
+ if err != nil {
+ log.Printf("Error sending heartbeat to %s: %v", node, err)
+ }
+ }
+ time.Sleep(heartbeatInterval)
+ }
+}
+
+func checkMasterHeartbeat() {
+ for {
+ time.Sleep(heartbeatTimeout)
+ masterNodeMux.RLock()
+ if masterNode == authCode || masterNode == "" {
+ masterNodeMux.RUnlock()
+ continue
+ }
+ masterNodeMux.RUnlock()
+
+ masterNodeMux.Lock()
+ masterNode = ""
+ masterNodeMux.Unlock()
+ startElection()
+ }
+}
+
+func startElection() {
+ masterNodeMux.Lock()
+ defer masterNodeMux.Unlock()
+
+ for _, node := range peers {
+ msg := Message{
+ ID: hostID,
+ Type: "election",
+ Content: authCode,
+ }
+ err := sendMessage(node, msg)
+ if err != nil {
+ log.Printf("Error sending election message to %s: %v", node, err)
+ }
+ }
+
+ isMaster = true
+ go sendHeartbeats()
+}
+
+func handleHeartbeat(content string) {
+ masterNodeMux.Lock()
+ defer masterNodeMux.Unlock()
+ masterNode = content
+}
+
+func handleElection(content string) {
+ masterNodeMux.Lock()
+ defer masterNodeMux.Unlock()
+
+ if content < authCode {
+ masterNode = content
+ }
+}
diff --git a/node-request-files.go b/node-request-files.go
new file mode 100755
index 0000000..0cabf32
--- /dev/null
+++ b/node-request-files.go
@@ -0,0 +1,82 @@
+package main
+
+import (
+ "encoding/json"
+ "fmt"
+ "time"
+)
+
+func tryOtherNodesForFileSearch(query, safe, lang string, page int, visitedNodes []string) []TorrentResult {
+ for _, nodeAddr := range peers {
+ if contains(visitedNodes, nodeAddr) {
+ continue // Skip nodes already visited
+ }
+ results, err := sendFileSearchRequestToNode(nodeAddr, query, safe, lang, page, visitedNodes)
+ if err != nil {
+ printWarn("Error contacting node %s: %v", nodeAddr, err)
+ continue
+ }
+ if len(results) > 0 {
+ return results
+ }
+ }
+ return nil
+}
+
+func sendFileSearchRequestToNode(nodeAddr, query, safe, lang string, page int, visitedNodes []string) ([]TorrentResult, error) {
+ visitedNodes = append(visitedNodes, nodeAddr)
+ searchParams := struct {
+ Query string `json:"query"`
+ Safe string `json:"safe"`
+ Lang string `json:"lang"`
+ Page int `json:"page"`
+ ResponseAddr string `json:"responseAddr"`
+ VisitedNodes []string `json:"visitedNodes"`
+ }{
+ Query: query,
+ Safe: safe,
+ Lang: lang,
+ Page: page,
+ ResponseAddr: fmt.Sprintf("http://localhost:%d/node", config.Port),
+ VisitedNodes: visitedNodes,
+ }
+
+ msgBytes, err := json.Marshal(searchParams)
+ if err != nil {
+ return nil, fmt.Errorf("failed to marshal search parameters: %v", err)
+ }
+
+ msg := Message{
+ ID: hostID,
+ Type: "search-file",
+ Content: string(msgBytes),
+ }
+
+ err = sendMessage(nodeAddr, msg)
+ if err != nil {
+ return nil, fmt.Errorf("failed to send search request to node %s: %v", nodeAddr, err)
+ }
+
+ // Wait for results
+ select {
+ case res := <-fileResultsChan:
+ return res, nil
+ case <-time.After(20 * time.Second):
+ return nil, fmt.Errorf("timeout waiting for results from node %s", nodeAddr)
+ }
+}
+
+func handleFileResultsMessage(msg Message) {
+ var results []TorrentResult
+ err := json.Unmarshal([]byte(msg.Content), &results)
+ if err != nil {
+ printWarn("Error unmarshalling file results: %v", err)
+ return
+ }
+
+ printDebug("Received file results: %+v", results)
+ // Send results to fileResultsChan
+ go func() {
+ fileResultsChan <- results
+ }()
+}
diff --git a/node-request-forums.go b/node-request-forums.go
new file mode 100755
index 0000000..ff6ed2e
--- /dev/null
+++ b/node-request-forums.go
@@ -0,0 +1,100 @@
+package main
+
+import (
+ "encoding/json"
+ "fmt"
+ "time"
+)
+
+var forumResultsChan = make(chan []ForumSearchResult)
+
+func tryOtherNodesForForumSearch(query, safe, lang string, page int) []ForumSearchResult {
+ for _, nodeAddr := range peers {
+ results, err := sendForumSearchRequestToNode(nodeAddr, query, safe, lang, page, []string{})
+ if err != nil {
+ printWarn("Error contacting node %s: %v", nodeAddr, err)
+ continue
+ }
+ if len(results) > 0 {
+ return results
+ }
+ }
+ return nil
+}
+
+func sendForumSearchRequestToNode(nodeAddr, query, safe, lang string, page int, visitedNodes []string) ([]ForumSearchResult, error) {
+ // Check if the current node has already been visited
+ for _, node := range visitedNodes {
+ if node == hostID {
+ return nil, fmt.Errorf("loop detected: this node (%s) has already been visited", hostID)
+ }
+ }
+
+ // Add current node to the list of visited nodes
+ visitedNodes = append(visitedNodes, hostID)
+
+ searchParams := struct {
+ Query string `json:"query"`
+ Safe string `json:"safe"`
+ Lang string `json:"lang"`
+ Page int `json:"page"`
+ ResponseAddr string `json:"responseAddr"`
+ VisitedNodes []string `json:"visitedNodes"`
+ }{
+ Query: query,
+ Safe: safe,
+ Lang: lang,
+ Page: page,
+ ResponseAddr: fmt.Sprintf("http://localhost:%d/node", config.Port),
+ VisitedNodes: visitedNodes,
+ }
+
+ msgBytes, err := json.Marshal(searchParams)
+ if err != nil {
+ return nil, fmt.Errorf("failed to marshal search parameters: %v", err)
+ }
+
+ msg := Message{
+ ID: hostID,
+ Type: "search-forum",
+ Content: string(msgBytes),
+ }
+
+ err = sendMessage(nodeAddr, msg)
+ if err != nil {
+ return nil, fmt.Errorf("failed to send search request to node %s: %v", nodeAddr, err)
+ }
+
+ // Wait for results
+ select {
+ case res := <-forumResultsChan:
+ return res, nil
+ case <-time.After(20 * time.Second):
+ return nil, fmt.Errorf("timeout waiting for results from node %s", nodeAddr)
+ }
+}
+
+func handleForumResultsMessage(msg Message) {
+ var results []ForumSearchResult
+ err := json.Unmarshal([]byte(msg.Content), &results)
+ if err != nil {
+ printWarn("Error unmarshalling forum results: %v", err)
+ return
+ }
+
+ printDebug("Received forum results: %+v", results)
+ // Send results to forumResultsChan
+ go func() {
+ forumResultsChan <- results
+ }()
+}
+
+// fetchForumResults answers forum search requests coming from other nodes.
+func fetchForumResults(query, safe, lang string, page int) []ForumSearchResult {
+	results, err := PerformRedditSearch(query, safe, page) // lang is currently unused by the Reddit backend
+ if err != nil {
+ printWarn("Error fetching forum results: %v", err)
+ return nil
+ }
+ return results
+}
diff --git a/node-request-images.go b/node-request-images.go
new file mode 100755
index 0000000..4e3e9e3
--- /dev/null
+++ b/node-request-images.go
@@ -0,0 +1,84 @@
+package main
+
+import (
+ "encoding/json"
+ "fmt"
+ "time"
+)
+
+var imageResultsChan = make(chan []ImageSearchResult)
+
+func handleImageResultsMessage(msg Message) {
+ var results []ImageSearchResult
+ err := json.Unmarshal([]byte(msg.Content), &results)
+ if err != nil {
+ printWarn("Error unmarshalling image results: %v", err)
+ return
+ }
+
+ printDebug("Received image results: %+v", results)
+ // Send results to imageResultsChan
+ go func() {
+ imageResultsChan <- results
+ }()
+}
+
+func sendImageSearchRequestToNode(nodeAddr, query, safe, lang string, page int, visitedNodes []string) ([]ImageSearchResult, error) {
+ visitedNodes = append(visitedNodes, nodeAddr)
+ searchParams := struct {
+ Query string `json:"query"`
+ Safe string `json:"safe"`
+ Lang string `json:"lang"`
+ Page int `json:"page"`
+ ResponseAddr string `json:"responseAddr"`
+ VisitedNodes []string `json:"visitedNodes"`
+ }{
+ Query: query,
+ Safe: safe,
+ Lang: lang,
+ Page: page,
+ ResponseAddr: fmt.Sprintf("http://localhost:%d/node", config.Port),
+ VisitedNodes: visitedNodes,
+ }
+
+ msgBytes, err := json.Marshal(searchParams)
+ if err != nil {
+ return nil, fmt.Errorf("failed to marshal search parameters: %v", err)
+ }
+
+ msg := Message{
+ ID: hostID,
+ Type: "search-image",
+ Content: string(msgBytes),
+ }
+
+ err = sendMessage(nodeAddr, msg)
+ if err != nil {
+ return nil, fmt.Errorf("failed to send search request to node %s: %v", nodeAddr, err)
+ }
+
+ // Wait for results
+ select {
+ case res := <-imageResultsChan:
+ return res, nil
+ case <-time.After(30 * time.Second):
+ return nil, fmt.Errorf("timeout waiting for results from node %s", nodeAddr)
+ }
+}
+
+func tryOtherNodesForImageSearch(query, safe, lang string, page int, visitedNodes []string) []ImageSearchResult {
+ for _, nodeAddr := range peers {
+ if contains(visitedNodes, nodeAddr) {
+ continue // Skip nodes already visited
+ }
+ results, err := sendImageSearchRequestToNode(nodeAddr, query, safe, lang, page, visitedNodes)
+ if err != nil {
+ printWarn("Error contacting node %s: %v", nodeAddr, err)
+ continue
+ }
+ if len(results) > 0 {
+ return results
+ }
+ }
+ return nil
+}
diff --git a/node-request-text.go b/node-request-text.go
new file mode 100755
index 0000000..ebe6041
--- /dev/null
+++ b/node-request-text.go
@@ -0,0 +1,84 @@
+package main
+
+import (
+ "encoding/json"
+ "fmt"
+ "time"
+)
+
+var textResultsChan = make(chan []TextSearchResult)
+
+func tryOtherNodesForTextSearch(query, safe, lang string, page int, visitedNodes []string) []TextSearchResult {
+ for _, nodeAddr := range peers {
+ if contains(visitedNodes, nodeAddr) {
+ continue // Skip nodes already visited
+ }
+ results, err := sendTextSearchRequestToNode(nodeAddr, query, safe, lang, page, visitedNodes)
+ if err != nil {
+ printWarn("Error contacting node %s: %v", nodeAddr, err)
+ continue
+ }
+ if len(results) > 0 {
+ return results
+ }
+ }
+ return nil
+}
+
+func sendTextSearchRequestToNode(nodeAddr, query, safe, lang string, page int, visitedNodes []string) ([]TextSearchResult, error) {
+ visitedNodes = append(visitedNodes, nodeAddr)
+ searchParams := struct {
+ Query string `json:"query"`
+ Safe string `json:"safe"`
+ Lang string `json:"lang"`
+ Page int `json:"page"`
+ ResponseAddr string `json:"responseAddr"`
+ VisitedNodes []string `json:"visitedNodes"`
+ }{
+ Query: query,
+ Safe: safe,
+ Lang: lang,
+ Page: page,
+ ResponseAddr: fmt.Sprintf("http://localhost:%d/node", config.Port),
+ VisitedNodes: visitedNodes,
+ }
+
+ msgBytes, err := json.Marshal(searchParams)
+ if err != nil {
+ return nil, fmt.Errorf("failed to marshal search parameters: %v", err)
+ }
+
+ msg := Message{
+ ID: hostID,
+ Type: "search-text",
+ Content: string(msgBytes),
+ }
+
+ err = sendMessage(nodeAddr, msg)
+ if err != nil {
+ return nil, fmt.Errorf("failed to send search request to node %s: %v", nodeAddr, err)
+ }
+
+ // Wait for results
+ select {
+ case res := <-textResultsChan:
+ return res, nil
+ case <-time.After(20 * time.Second):
+ return nil, fmt.Errorf("timeout waiting for results from node %s", nodeAddr)
+ }
+}
+
+func handleTextResultsMessage(msg Message) {
+ var results []TextSearchResult
+ err := json.Unmarshal([]byte(msg.Content), &results)
+ if err != nil {
+ printWarn("Error unmarshalling text results: %v", err)
+ return
+ }
+
+ printDebug("Received text results: %+v", results)
+ // Send results to textResultsChan
+ go func() {
+ textResultsChan <- results
+ }()
+}
diff --git a/node-request-video.go b/node-request-video.go
new file mode 100755
index 0000000..d965a7d
--- /dev/null
+++ b/node-request-video.go
@@ -0,0 +1,82 @@
+package main
+
+import (
+ "encoding/json"
+ "fmt"
+ "time"
+)
+
+func tryOtherNodesForVideoSearch(query, safe, lang string, page int, visitedNodes []string) []VideoResult {
+ for _, nodeAddr := range peers {
+ if contains(visitedNodes, nodeAddr) {
+ continue // Skip nodes already visited
+ }
+ results, err := sendVideoSearchRequestToNode(nodeAddr, query, safe, lang, page, visitedNodes)
+ if err != nil {
+ printWarn("Error contacting node %s: %v", nodeAddr, err)
+ continue
+ }
+ if len(results) > 0 {
+ return results
+ }
+ }
+ return nil
+}
+
+func sendVideoSearchRequestToNode(nodeAddr, query, safe, lang string, page int, visitedNodes []string) ([]VideoResult, error) {
+ visitedNodes = append(visitedNodes, nodeAddr)
+ searchParams := struct {
+ Query string `json:"query"`
+ Safe string `json:"safe"`
+ Lang string `json:"lang"`
+ Page int `json:"page"`
+ ResponseAddr string `json:"responseAddr"`
+ VisitedNodes []string `json:"visitedNodes"`
+ }{
+ Query: query,
+ Safe: safe,
+ Lang: lang,
+ Page: page,
+ ResponseAddr: fmt.Sprintf("http://localhost:%d/node", config.Port),
+ VisitedNodes: visitedNodes,
+ }
+
+ msgBytes, err := json.Marshal(searchParams)
+ if err != nil {
+ return nil, fmt.Errorf("failed to marshal search parameters: %v", err)
+ }
+
+ msg := Message{
+ ID: hostID,
+ Type: "search-video",
+ Content: string(msgBytes),
+ }
+
+ err = sendMessage(nodeAddr, msg)
+ if err != nil {
+ return nil, fmt.Errorf("failed to send search request to node %s: %v", nodeAddr, err)
+ }
+
+ // Wait for results
+ select {
+ case res := <-videoResultsChan:
+ return res, nil
+ case <-time.After(20 * time.Second):
+ return nil, fmt.Errorf("timeout waiting for results from node %s", nodeAddr)
+ }
+}
+
+func handleVideoResultsMessage(msg Message) {
+ var results []VideoResult
+ err := json.Unmarshal([]byte(msg.Content), &results)
+ if err != nil {
+ printWarn("Error unmarshalling video results: %v", err)
+ return
+ }
+
+ printDebug("Received video results: %+v", results)
+ // Send results to videoResultsChan
+ go func() {
+ videoResultsChan <- results
+ }()
+}
diff --git a/node-update.go b/node-update.go
new file mode 100644
index 0000000..f433eb4
--- /dev/null
+++ b/node-update.go
@@ -0,0 +1,28 @@
+package main
+
+import (
+ "fmt"
+ "log"
+ "time"
+)
+
+// nodeUpdateSync notifies each peer in turn to start its update process and waits for it to apply.
+func nodeUpdateSync() {
+ fmt.Println("Syncing updates across all nodes...")
+ for _, peerAddr := range peers {
+ fmt.Printf("Notifying node %s about update...\n", peerAddr)
+ msg := Message{
+ ID: hostID,
+ Type: "update",
+ Content: "Start update process",
+ }
+ err := sendMessage(peerAddr, msg)
+ if err != nil {
+ log.Printf("Failed to notify node %s: %v\n", peerAddr, err)
+ continue
+ }
+ fmt.Printf("Node %s notified. Waiting for it to update...\n", peerAddr)
+ time.Sleep(30 * time.Second) // Adjust sleep time as needed to allow for updates
+ }
+ fmt.Println("All nodes have been updated.")
+}
diff --git a/node.go b/node.go
new file mode 100644
index 0000000..43960a3
--- /dev/null
+++ b/node.go
@@ -0,0 +1,156 @@
+package main
+
+import (
+ "bytes"
+ "crypto/rand"
+ "encoding/json"
+ "fmt"
+ "io/ioutil"
+ "net/http"
+ "time"
+)
+
+var (
+ authCode string
+ peers []string
+ hostID string
+)
+
+type Message struct {
+ ID string `json:"id"`
+ Type string `json:"type"`
+ Content string `json:"content"`
+ VisitedNodes []string `json:"visitedNodes"`
+}
+
+func loadNodeConfig() {
+ config := loadConfig()
+ authCode = config.AuthCode
+ peers = config.Peers
+}
+
+func generateHostID() (string, error) {
+ bytes := make([]byte, 16)
+ _, err := rand.Read(bytes)
+ if err != nil {
+ return "", fmt.Errorf("failed to generate host ID: %v", err)
+ }
+ return fmt.Sprintf("%x", bytes), nil
+}
+
+func sendMessage(serverAddr string, msg Message) error {
+ if serverAddr == "" {
+ return fmt.Errorf("server address is empty")
+ }
+
+ msgBytes, err := json.Marshal(msg)
+ if err != nil {
+ return fmt.Errorf("failed to marshal message: %v", err)
+ }
+
+ req, err := http.NewRequest("POST", serverAddr, bytes.NewBuffer(msgBytes))
+ if err != nil {
+ return fmt.Errorf("failed to create request: %v", err)
+ }
+ req.Header.Set("Content-Type", "application/json")
+ req.Header.Set("Authorization", authCode)
+
+ client := &http.Client{
+ Timeout: time.Second * 10,
+ }
+
+ resp, err := client.Do(req)
+ if err != nil {
+ return fmt.Errorf("failed to send request: %v", err)
+ }
+ defer resp.Body.Close()
+
+ if resp.StatusCode != http.StatusOK {
+ body, _ := ioutil.ReadAll(resp.Body)
+ return fmt.Errorf("server error: %s", body)
+ }
+
+ return nil
+}
+
+func handleNodeRequest(w http.ResponseWriter, r *http.Request) {
+ if r.Method != http.MethodPost {
+ http.Error(w, "Invalid request method", http.StatusMethodNotAllowed)
+ return
+ }
+
+ auth := r.Header.Get("Authorization")
+ if auth != authCode {
+ http.Error(w, "Unauthorized", http.StatusUnauthorized)
+ return
+ }
+
+ var msg Message
+ err := json.NewDecoder(r.Body).Decode(&msg)
+ if err != nil {
+ http.Error(w, "Error parsing JSON", http.StatusBadRequest)
+ return
+ }
+ defer r.Body.Close()
+
+	printDebug("Received message: %+v", msg)
+ w.Write([]byte("Message received"))
+
+ interpretMessage(msg)
+}
+
+func startNodeClient() {
+ for {
+ for _, peerAddr := range peers {
+ msg := Message{
+ ID: hostID,
+ Type: "test",
+ Content: "This is a test message from the client node",
+ }
+
+ err := sendMessage(peerAddr, msg)
+ if err != nil {
+ printWarn("Error sending message to %s: %v", peerAddr, err)
+ } else {
+ printInfo("Message sent successfully to: %s", peerAddr)
+ }
+ }
+ time.Sleep(10 * time.Second)
+ }
+}
+
+func interpretMessage(msg Message) {
+ switch msg.Type {
+ case "test":
+ printDebug("Received test message: %v", msg.Content)
+ case "update":
+ printDebug("Received update message: %v", msg.Content)
+ go update()
+ case "heartbeat":
+ handleHeartbeat(msg.Content)
+ case "election":
+ handleElection(msg.Content)
+ case "search-text":
+ handleSearchTextMessage(msg)
+ case "search-image":
+ handleSearchImageMessage(msg)
+ case "search-video":
+ handleSearchVideoMessage(msg)
+ case "search-file":
+ handleSearchFileMessage(msg)
+ case "search-forum":
+ handleSearchForumMessage(msg)
+ case "forum-results":
+ handleForumResultsMessage(msg)
+ case "text-results":
+ handleTextResultsMessage(msg)
+ case "image-results":
+ handleImageResultsMessage(msg)
+ case "video-results":
+ handleVideoResultsMessage(msg)
+ case "file-results":
+ handleFileResultsMessage(msg)
+ default:
+ printWarn("Received unknown message type: %v", msg.Type)
+ }
+}
diff --git a/open-search.go b/open-search.go
index b685e42..280fe3d 100644
--- a/open-search.go
+++ b/open-search.go
@@ -21,20 +21,22 @@ type URL struct {
}
func generateOpenSearchXML(config Config) {
+ baseURL := addProtocol(config.Domain)
+
opensearch := OpenSearchDescription{
Xmlns: "http://a9.com/-/spec/opensearch/1.1/",
- ShortName: "Ocásek",
+ ShortName: "Search Engine",
Description: "Search engine",
Tags: "search, engine",
URL: URL{
Type: "text/html",
- Template: fmt.Sprintf("https://%s/search?q={searchTerms}", config.OpenSearch.Domain),
+ Template: fmt.Sprintf("%s/search?q={searchTerms}", baseURL),
},
}
file, err := os.Create("static/opensearch.xml")
if err != nil {
- fmt.Println("Error creating OpenSearch file:", err)
+ printErr("Error creating OpenSearch file: %v", err)
return
}
defer file.Close()
@@ -42,9 +44,9 @@ func generateOpenSearchXML(config Config) {
enc := xml.NewEncoder(file)
enc.Indent(" ", " ")
if err := enc.Encode(opensearch); err != nil {
- fmt.Println("Error encoding OpenSearch XML:", err)
+ printErr("Error encoding OpenSearch XML: %v", err)
return
}
- fmt.Println("OpenSearch description file generated successfully.")
+ printInfo("OpenSearch description file generated successfully.")
}
diff --git a/printing.go b/printing.go
new file mode 100644
index 0000000..2cfd951
--- /dev/null
+++ b/printing.go
@@ -0,0 +1,50 @@
+package main
+
+import (
+ "fmt"
+ "time"
+)
+
+// printDebug logs debug-level messages when LogLevel is set to 4 or higher.
+func printDebug(format string, args ...interface{}) {
+ if config.LogLevel >= 4 {
+ logMessage("DEBUG", format, args...)
+ }
+}
+
+// printInfo logs info-level messages when LogLevel is set to 3 or higher.
+func printInfo(format string, args ...interface{}) {
+ if config.LogLevel >= 3 {
+ logMessage("INFO", format, args...)
+ }
+}
+
+// printWarn logs warning-level messages when LogLevel is set to 2 or higher.
+func printWarn(format string, args ...interface{}) {
+ if config.LogLevel >= 2 {
+ logMessage("WARN", format, args...)
+ }
+}
+
+// printErr logs error-level messages when LogLevel is set to 1 or higher.
+func printErr(format string, args ...interface{}) {
+ if config.LogLevel >= 1 {
+ logMessage("ERROR", format, args...)
+ }
+}
+
+// printMessage logs messages without a specific log level (e.g., general output).
+func printMessage(format string, args ...interface{}) {
+ logMessage("", format, args...)
+}
+
+// logMessage handles the actual logging logic without using the default logger's timestamp.
+func logMessage(level string, format string, args ...interface{}) {
+ timestamp := time.Now().Format("2006-01-02 15:04:05")
+ message := fmt.Sprintf(format, args...)
+ if level != "" {
+ fmt.Printf("[%s %s] %s\n", timestamp, level, message)
+ } else {
+ fmt.Printf("[%s] %s\n", timestamp, message)
+ }
+}
diff --git a/run.bat b/run.bat
new file mode 100755
index 0000000..2709b17
--- /dev/null
+++ b/run.bat
@@ -0,0 +1,31 @@
+@echo off
+setlocal enabledelayedexpansion
+
+rem Directory where the Go files are located
+set GO_DIR=C:\path\to\your\go\files
+
+rem Explicitly list the main files in the required order
+set FILES=main.go init.go search-engine.go text.go text-google.go text-librex.go text-brave.go text-duckduckgo.go common.go cache.go agent.go files.go files-thepiratebay.go files-torrentgalaxy.go forums.go get-searchxng.go imageproxy.go images.go images-imgur.go images-quant.go map.go node.go open-search.go video.go
+
+rem Change to the directory with the Go files
+pushd %GO_DIR%
+
+rem Find all other .go files that were not explicitly listed
+set OTHER_GO_FILES=
+
+for %%f in (*.go) do (
+ set file=%%~nxf
+ set found=0
+ for %%i in (%FILES%) do (
+ if /i "%%i"=="!file!" set found=1
+ )
+ if !found!==0 (
+ set OTHER_GO_FILES=!OTHER_GO_FILES! "%%f"
+ )
+)
+
+rem Run the Go program with the specified files first, followed by the remaining files
+go run %FILES% %OTHER_GO_FILES%
+
+rem Return to the original directory
+popd
diff --git a/run.sh b/run.sh
index da5bcae..6aa8efa 100755
--- a/run.sh
+++ b/run.sh
@@ -1,7 +1,35 @@
#!/bin/sh
-# Find all .go files in the current directory
-GO_FILES=$(find . -name '*.go' -print)
+# Explicitly list the main files in the required order
+FILES="
+./main.go
+./init.go
+./search-engine.go
+./text.go
+./text-google.go
+./text-librex.go
+./text-brave.go
+./text-duckduckgo.go
+./common.go
+./cache.go
+./agent.go
+./files.go
+./files-thepiratebay.go
+./files-torrentgalaxy.go
+./forums.go
+./get-searchxng.go
+./imageproxy.go
+./images.go
+./images-imgur.go
+./images-quant.go
+./map.go
+./node.go
+./open-search.go
+./video.go
+"
-# Run the Go program
-go run $GO_FILES
+# Find all other .go files that were not explicitly listed
+OTHER_GO_FILES=$(find . -name '*.go' \
+  ! -name 'main.go' ! -name 'init.go' ! -name 'search-engine.go' ! -name 'text.go' \
+  ! -name 'text-google.go' ! -name 'text-librex.go' ! -name 'text-brave.go' ! -name 'text-duckduckgo.go' \
+  ! -name 'common.go' ! -name 'cache.go' ! -name 'agent.go' ! -name 'files.go' \
+  ! -name 'files-thepiratebay.go' ! -name 'files-torrentgalaxy.go' ! -name 'forums.go' ! -name 'get-searchxng.go' \
+  ! -name 'imageproxy.go' ! -name 'images.go' ! -name 'images-imgur.go' ! -name 'images-quant.go' \
+  ! -name 'map.go' ! -name 'node.go' ! -name 'open-search.go' ! -name 'video.go' \
+  -print)
+
+# Run the Go program with the specified files first, followed by the remaining files
+go run $FILES $OTHER_GO_FILES
diff --git a/search-engine.go b/search-engine.go
index 36dde9a..1396b4e 100644
--- a/search-engine.go
+++ b/search-engine.go
@@ -1,13 +1,18 @@
package main
import (
+ "encoding/json"
+ "fmt"
+ "log"
"math/rand"
+ "net/http"
"sync"
"time"
)
var (
searchEngineLock sync.Mutex
+	searchEngines []SearchEngine // Weighted engine list; populated in init()
)
// SearchEngine struct now includes metrics for calculating reputation.
@@ -19,11 +24,23 @@ type SearchEngine struct {
TotalTime time.Duration
SuccessfulSearches int
FailedSearches int
+ IsCrawler bool // Indicates if this search engine is a crawler
+ Host string // Host of the crawler
+ Port int // Port of the crawler
+ AuthCode string // Auth code for the crawler
}
// init function seeds the random number generator.
func init() {
rand.Seed(time.Now().UnixNano())
+ // Initialize the searchEngines list
+ searchEngines = []SearchEngine{
+ {Name: "Google", Func: wrapTextSearchFunc(PerformGoogleTextSearch), Weight: 1},
+ {Name: "LibreX", Func: wrapTextSearchFunc(PerformLibreXTextSearch), Weight: 2},
+ {Name: "Brave", Func: wrapTextSearchFunc(PerformBraveTextSearch), Weight: 2},
+ {Name: "DuckDuckGo", Func: wrapTextSearchFunc(PerformDuckDuckGoTextSearch), Weight: 5},
+ // {Name: "SearXNG", Func: wrapTextSearchFunc(PerformSearXNGTextSearch), Weight: 2}, // Uncomment when implemented
+ }
}
// Selects a search engine based on weighted random selection with dynamic weighting.
@@ -88,3 +105,47 @@ func calculateReputation(engine SearchEngine) int {
// Scale reputation for better interpretability (e.g., multiply by 10)
return int(reputation * 10)
}
+
+func fetchSearchResults(query, safe, lang, searchType string, page int) []SearchResult {
+ var results []SearchResult
+
+ engine := selectSearchEngine(searchEngines)
+ log.Printf("Using search engine: %s", engine.Name)
+
+ if engine.IsCrawler {
+ searchResults, duration, err := fetchSearchFromCrawler(engine, query, safe, lang, searchType, page)
+ updateEngineMetrics(&engine, duration, err == nil)
+ if err != nil {
+ log.Printf("Error performing search with crawler %s: %v", engine.Name, err)
+ return nil
+ }
+ results = append(results, searchResults...)
+ } else {
+ searchResults, duration, err := engine.Func(query, safe, lang, page)
+ updateEngineMetrics(&engine, duration, err == nil)
+ if err != nil {
+ log.Printf("Error performing search with %s: %v", engine.Name, err)
+ return nil
+ }
+ results = append(results, searchResults...)
+ }
+
+ return results
+}
+
+func fetchSearchFromCrawler(engine SearchEngine, query, safe, lang, searchType string, page int) ([]SearchResult, time.Duration, error) {
+ url := fmt.Sprintf("http://%s:%d/search?q=%s&safe=%s&lang=%s&t=%s&p=%d", engine.Host, engine.Port, query, safe, lang, searchType, page)
+ start := time.Now()
+ resp, err := http.Get(url)
+ if err != nil {
+ return nil, 0, err
+ }
+ defer resp.Body.Close()
+
+ var results []SearchResult
+ if err := json.NewDecoder(resp.Body).Decode(&results); err != nil {
+ return nil, 0, err
+ }
+
+ return results, time.Since(start), nil
+}
diff --git a/static/css/black.css b/static/css/black.css
new file mode 100644
index 0000000..9661246
--- /dev/null
+++ b/static/css/black.css
@@ -0,0 +1,74 @@
+:root {
+ --html-bg: #000000;
+ --font-fg: #fafafa;
+ --fg: #BABCBE;
+
+ --search-bg: #000000;
+ --search-bg-input: #000000;
+ --search-bg-input-border: #5f6368;
+ --search-select: #282828;
+
+ --border: #707070;
+
+ --link: #8ab4f8;
+ --link-visited: #c58af9;
+
+ --snip-border: #303134;
+ --snip-background: #282828;
+ --snip-text: #f1f3f4;
+
+ --settings-border: #5f6368;
+ --button: #000000;
+
+ --footer-bg: #161616;
+ --footer-font: #999da2;
+
+ --highlight: #bcc0c3;
+
+ --blue: #8ab4f8;
+
+ --green: #31b06e;
+
+ --search-button: #BABCBE;
+
+ --image-view: #161616;
+ --image-view-titlebar: #161616;
+ --view-image-color: #000000;
+ --image-select: #303030;
+ --fff: #fff;
+
+ --publish-info: #7f869e;
+
+ color-scheme: dark;
+}
+
+.calc-btn:hover {
+ box-shadow: 0 14px 28px rgba(0, 0, 0, 0.25), 0 10px 10px rgba(0, 0, 0, 0.22);
+}
+
+.calc-btn-2:hover {
+ box-shadow: 0 14px 28px rgba(0, 0, 0, 0.25), 0 10px 10px rgba(0, 0, 0, 0.22);
+}
+
+.calc-btn-2 {
+ box-shadow: 0 1px 3px rgba(0, 0, 0, 0.12), 0 1px 2px rgba(0, 0, 0, 0.24);
+ transition: all 0.3s cubic-bezier(.25, .8, .25, 1);
+}
+
+.calc-btn {
+ box-shadow: 0 1px 3px rgba(0, 0, 0, 0.12), 0 1px 2px rgba(0, 0, 0, 0.24);
+ transition: all 0.3s cubic-bezier(.25, .8, .25, 1);
+}
+
+.calc {
+ box-shadow: 0 1px 3px rgba(0, 0, 0, 0.5);
+}
+
+.view-image-search {
+ box-shadow: 0 1px 3px rgba(0, 0, 0, 0.12), 0 1px 2px rgba(0, 0, 0, 0.24);
+ transition: all 0.3s cubic-bezier(.25, .8, .25, 1);
+}
+
+.view-image-search:hover {
+ box-shadow: 0 14px 28px rgba(0, 0, 0, 0.25), 0 10px 10px rgba(0, 0, 0, 0.22);
+}
diff --git a/static/css/dark.css b/static/css/dark.css
new file mode 100644
index 0000000..0555cd3
--- /dev/null
+++ b/static/css/dark.css
@@ -0,0 +1,74 @@
+:root {
+ --html-bg: #1c1c1c;
+ --font-fg: #f1f3f4;
+ --fg: #BABCBE;
+
+ --search-bg: #161616;
+ --search-bg-input: #333333;
+ --search-bg-input-border: #3C4043;
+ --search-select: #282828;
+
+ --border: #303134;
+
+ --link: #8ab4f8;
+ --link-visited: #c58af9;
+
+ --snip-border: #303134;
+ --snip-background: #282828;
+ --snip-text: #f1f3f4;
+
+ --settings-border: #5f6368;
+ --button: #333333;
+
+ --footer-bg: #161616;
+ --footer-font: #999da2;
+
+ --highlight: #bcc0c3;
+
+ --blue: #8ab4f8;
+
+ --green: #31b06e;
+
+ --search-button: #BABCBE;
+
+ --image-view: #161616;
+ --image-view-titlebar: #161616;
+ --view-image-color: #000000;
+ --image-select: #303030;
+ --fff: #fff;
+
+ --publish-info: #7f869e;
+
+ color-scheme: dark;
+}
+
+.calc-btn:hover {
+ box-shadow: 0 14px 28px rgba(0, 0, 0, 0.25), 0 10px 10px rgba(0, 0, 0, 0.22);
+}
+
+.calc-btn-2:hover {
+ box-shadow: 0 14px 28px rgba(0, 0, 0, 0.25), 0 10px 10px rgba(0, 0, 0, 0.22);
+}
+
+.calc-btn-2 {
+ box-shadow: 0 1px 3px rgba(0, 0, 0, 0.12), 0 1px 2px rgba(0, 0, 0, 0.24);
+ transition: all 0.3s cubic-bezier(.25, .8, .25, 1);
+}
+
+.calc-btn {
+ box-shadow: 0 1px 3px rgba(0, 0, 0, 0.12), 0 1px 2px rgba(0, 0, 0, 0.24);
+ transition: all 0.3s cubic-bezier(.25, .8, .25, 1);
+}
+
+.calc {
+ box-shadow: 0 1px 3px rgba(0, 0, 0, 0.5);
+}
+
+.view-image-search {
+ box-shadow: 0 1px 3px rgba(0, 0, 0, 0.12), 0 1px 2px rgba(0, 0, 0, 0.24);
+ transition: all 0.3s cubic-bezier(.25, .8, .25, 1);
+}
+
+.view-image-search:hover {
+ box-shadow: 0 14px 28px rgba(0, 0, 0, 0.25), 0 10px 10px rgba(0, 0, 0, 0.22);
+}
diff --git a/static/css/latte.css b/static/css/latte.css
new file mode 100644
index 0000000..b217da7
--- /dev/null
+++ b/static/css/latte.css
@@ -0,0 +1,100 @@
+:root {
+ --rosewater: #f5e0dc;
+ --flamingo: #f2cdcd;
+ --pink: #f5c2e7;
+ --mauve: #cba6f7;
+ --red: #f38ba8;
+ --maroon: #eba0ac;
+ --peach: #fab387;
+ --yellow: #f9e2af;
+ --green: #a6e3a1;
+ --teal: #94e2d5;
+ --sky: #89dceb;
+ --sapphire: #74c7ec;
+ --blue: #89b4fa;
+ --lavender: #b4befe;
+ --text: #cdd6f4;
+ --subtext1: #bac2de;
+ --subtext0: #a6adc8;
+ --overlay2: #9399b2;
+ --overlay1: #7f849c;
+ --overlay0: #6c7086;
+ --surface2: #585b70;
+ --surface1: #45475a;
+ --surface0: #313244;
+ --base: #1e1e2e;
+ --mantle: #181825;
+ --crust: #11111b;
+
+ --html-bg: var(--base);
+ --font-fg: var(--text);
+ --fg: var(--subtext0);
+
+ --search-bg: var(--mantle);
+ --search-bg-input: var(--surface1);
+ --search-bg-input-border: var(--overlay0);
+ --search-select: var(--surface0);
+
+ --border: var(--overlay0);
+
+ --link: var(--blue);
+ --link-visited: var(--mauve);
+
+ --snip-border: var(--surface1);
+ --snip-background: var(--surface0);
+ --snip-text: var(--text);
+
+ --settings-border: var(--overlay1);
+ --button: var(--surface1);
+
+ --footer-bg: var(--mantle);
+ --footer-font: var(--overlay1);
+
+ --highlight: var(--subtext1);
+
+
+ --search-button: var(--subtext0);
+
+ --image-view: var(--mantle);
+ --image-view-titlebar: var(--mantle);
+ --view-image-color: var(--crust);
+ --image-select: var(--surface1);
+ --fff: var(--text);
+
+ --publish-info: var(--overlay2);
+
+ color-scheme: dark;
+}
+
+.calc-btn:hover {
+ box-shadow: 0 14px 28px rgba(0, 0, 0, 0.25), 0 10px 10px rgba(0, 0, 0, 0.22);
+}
+
+.calc-btn-2:hover {
+ box-shadow: 0 14px 28px rgba(0, 0, 0, 0.25), 0 10px 10px rgba(0, 0, 0, 0.22);
+}
+
+.calc-btn-2 {
+ box-shadow: 0 1px 3px rgba(0, 0, 0, 0.12), 0 1px 2px rgba(0, 0, 0, 0.24);
+ transition: all 0.3s cubic-bezier(.25, .8, .25, 1);
+}
+
+.calc-btn {
+ box-shadow: 0 1px 3px rgba(0, 0, 0, 0.12), 0 1px 2px rgba(0, 0, 0, 0.24);
+ transition: all 0.3s cubic-bezier(.25, .8, .25, 1);
+}
+
+.calc {
+ box-shadow: 0 1px 3px rgba(0, 0, 0, 0.5);
+}
+
+.view-image-search {
+ box-shadow: 0 1px 3px rgba(0, 0, 0, 0.12), 0 1px 2px rgba(0, 0, 0, 0.24);
+ transition: all 0.3s cubic-bezier(.25, .8, .25, 1);
+}
+
+.view-image-search:hover {
+ box-shadow: 0 14px 28px rgba(0, 0, 0, 0.25), 0 10px 10px rgba(0, 0, 0, 0.22);
+}
diff --git a/static/css/light.css b/static/css/light.css
new file mode 100644
index 0000000..656f89a
--- /dev/null
+++ b/static/css/light.css
@@ -0,0 +1,72 @@
+:root {
+ --html-bg: #ffffff;
+ --font-fg: #000000;
+ --fg: #202124;
+
+ --search-bg: #ffffff;
+ --search-bg-input: #f6f6f6;
+ --search-bg-input-border: #dadce0;
+ --search-select: #eeeeee;
+
+ --border: #dadce0;
+
+ --link: #1a0dab;
+ --link-visited: #681da8;
+
+ --snip-border: #dadce0;
+ --snip-background: #ffffff;
+ --snip-text: #000000;
+
+ --settings-border: #5f6368;
+ --button: #f6f6f6;
+
+ --footer-bg: #f6f6f6;
+ --footer-font: #353535;
+
+ --highlight: #202124;
+
+ --blue: #4285f4;
+
+ --green: #202124;
+
+ --image-view: #ffffff;
+ --image-view-titlebar: #ffffff;
+ --view-image-color: #f1f3f4;
+ --image-select: #f6f6f6;
+ --fff: #fff;
+
+ --publish-info: #202124;
+
+ --search-button: #202124;
+}
+
+.wrapper-results:hover,
+.wrapper-results:focus-within,
+.wrapper:hover,
+.wrapper:focus-within {
+ box-shadow: 0px 2px 4px rgba(0, 0, 0, 0.08) !important;
+ transition: all 0.3s cubic-bezier(.25, .8, .25, 1) !important;
+}
+
+.check p {
+ color: var(--highlight) !important;
+}
+
+.image_view {
+ box-shadow: 0px 2px 4px rgba(0, 0, 0, 0.08) !important;
+ transition: all 0.3s cubic-bezier(.25, .8, .25, 1) !important;
+}
+
+.search-menu {
+ box-shadow: 0px 2px 4px rgba(0, 0, 0, 0.08) !important;
+}
+
+.view-image-search {
+ box-shadow: none;
+ transition: all 0.3s cubic-bezier(.25, .8, .25, 1) !important;
+}
+
+.view-image-search:hover {
+ box-shadow: 0px 2px 4px rgba(0, 0, 0, 0.08) !important;
+ transition: all 0.3s cubic-bezier(.25, .8, .25, 1) !important;
+}
diff --git a/static/css/mocha.css b/static/css/mocha.css
new file mode 100644
index 0000000..543332f
--- /dev/null
+++ b/static/css/mocha.css
@@ -0,0 +1,100 @@
+:root {
+ --rosewater: #dc8a78;
+ --flamingo: #dd7878;
+ --pink: #ea76cb;
+ --mauve: #8839ef;
+ --red: #d20f39;
+ --maroon: #e64553;
+ --peach: #fe640b;
+ --yellow: #df8e1d;
+ --green: #40a02b;
+ --teal: #179299;
+ --sky: #04a5e5;
+ --sapphire: #209fb5;
+ --blue: #1e66f5;
+ --lavender: #7287fd;
+ --text: #4c4f69;
+ --subtext1: #5c5f77;
+ --subtext0: #6c6f85;
+ --overlay2: #7c7f93;
+ --overlay1: #8c8fa1;
+ --overlay0: #9ca0b0;
+ --surface2: #acb0be;
+ --surface1: #bcc0cc;
+ --surface0: #ccd0da;
+ --base: #eff1f5;
+ --mantle: #e6e9ef;
+ --crust: #dce0e8;
+
+ --html-bg: var(--base);
+ --font-fg: var(--text);
+ --fg: var(--subtext0);
+
+ --search-bg: var(--mantle);
+ --search-bg-input: var(--surface1);
+ --search-bg-input-border: var(--overlay0);
+ --search-select: var(--surface0);
+
+ --border: var(--overlay0);
+
+ --link: var(--blue);
+ --link-visited: var(--mauve);
+
+ --snip-border: var(--surface1);
+ --snip-background: var(--surface0);
+ --snip-text: var(--text);
+
+ --settings-border: var(--overlay1);
+ --button: var(--surface1);
+
+ --footer-bg: var(--mantle);
+ --footer-font: var(--overlay1);
+
+ --highlight: var(--subtext1);
+
+
+ --search-button: var(--subtext0);
+
+ --image-view: var(--mantle);
+ --image-view-titlebar: var(--mantle);
+ --view-image-color: var(--crust);
+ --image-select: var(--surface1);
+ --fff: var(--text);
+
+ --publish-info: var(--overlay2);
+
+ color-scheme: light;
+}
+
+.calc-btn:hover {
+ box-shadow: 0 14px 28px rgba(0, 0, 0, 0.15), 0 10px 10px rgba(0, 0, 0, 0.12);
+}
+
+.calc-btn-2:hover {
+ box-shadow: 0 14px 28px rgba(0, 0, 0, 0.15), 0 10px 10px rgba(0, 0, 0, 0.12);
+}
+
+.calc-btn-2 {
+ box-shadow: 0 1px 3px rgba(0, 0, 0, 0.10), 0 1px 2px rgba(0, 0, 0, 0.14);
+ transition: all 0.3s cubic-bezier(.25, .8, .25, 1);
+}
+
+.calc-btn {
+ box-shadow: 0 1px 3px rgba(0, 0, 0, 0.10), 0 1px 2px rgba(0, 0, 0, 0.14);
+ transition: all 0.3s cubic-bezier(.25, .8, .25, 1);
+}
+
+.calc {
+ box-shadow: 0 1px 3px rgba(0, 0, 0, 0.15);
+}
+
+.view-image-search {
+ box-shadow: 0 1px 3px rgba(0, 0, 0, 0.10), 0 1px 2px rgba(0, 0, 0, 0.14);
+ transition: all 0.3s cubic-bezier(.25, .8, .25, 1);
+}
+
+.view-image-search:hover {
+ box-shadow: 0 14px 28px rgba(0, 0, 0, 0.15), 0 10px 10px rgba(0, 0, 0, 0.12);
+}
diff --git a/static/css/night.css b/static/css/night.css
new file mode 100644
index 0000000..0676672
--- /dev/null
+++ b/static/css/night.css
@@ -0,0 +1,74 @@
+:root {
+ --html-bg: #171b25;
+ --font-fg: #ebecf7;
+ --fg: #ebecf7;
+
+ --search-bg: #0c0d0f;
+ --search-bg-input: #2e3443;
+ --search-bg-input-border: #2e3443;
+ --search-select: #3a445c;
+
+ --border: #2e3443;
+
+ --link: #a7b1fc;
+ --link-visited: #ad71bc;
+
+ --snip-border: #2e3443;
+ --snip-background: #1e222d;
+ --snip-text: #f1f3f4;
+
+ --settings-border: #5f6368;
+ --button: #0c0d0f;
+
+ --footer-bg: #0c0d0f;
+ --footer-font: #ebecf7;
+
+ --highlight: #ebecf7;
+
+ --blue: #8ab4f8;
+
+ --green: #31b06e;
+
+ --image-view: #0c0d0f;
+ --image-view-titlebar: #0c0d0f;
+ --view-image-color: #000000;
+ --image-select: #303030;
+ --fff: #fff;
+
+ --search-button: #babcbe;
+
+ --publish-info: #7f869e;
+
+ color-scheme: dark;
+}
+
+.calc-btn:hover {
+ box-shadow: 0 14px 28px rgba(0, 0, 0, 0.25), 0 10px 10px rgba(0, 0, 0, 0.22);
+}
+
+.calc-btn-2:hover {
+ box-shadow: 0 14px 28px rgba(0, 0, 0, 0.25), 0 10px 10px rgba(0, 0, 0, 0.22);
+}
+
+.calc-btn-2 {
+ box-shadow: 0 1px 3px rgba(0, 0, 0, 0.12), 0 1px 2px rgba(0, 0, 0, 0.24);
+ transition: all 0.3s cubic-bezier(.25, .8, .25, 1);
+}
+
+.calc-btn {
+ box-shadow: 0 1px 3px rgba(0, 0, 0, 0.12), 0 1px 2px rgba(0, 0, 0, 0.24);
+ transition: all 0.3s cubic-bezier(.25, .8, .25, 1);
+}
+
+.calc {
+ box-shadow: 0 1px 3px rgba(0, 0, 0, 0.5);
+}
+
+.view-image-search {
+ box-shadow: 0 1px 3px rgba(0, 0, 0, 0.12), 0 1px 2px rgba(0, 0, 0, 0.24);
+ transition: all 0.3s cubic-bezier(.25, .8, .25, 1);
+}
+
+.view-image-search:hover {
+ box-shadow: 0 14px 28px rgba(0, 0, 0, 0.25), 0 10px 10px rgba(0, 0, 0, 0.22);
+}
diff --git a/static/css/style-settings.css b/static/css/style-settings.css
new file mode 100644
index 0000000..0dda333
--- /dev/null
+++ b/static/css/style-settings.css
@@ -0,0 +1,62 @@
+
+/* settings.html */
+
+.theme-link {
+ display: block;
+ text-decoration: none;
+ color: inherit;
+ width: 48%;
+ margin-bottom: 10px;
+ height: 150px;
+ position: relative; /* Make it possible to position the tooltip */
+}
+
+.theme-link img {
+ width: 100%;
+ height: 100%;
+ object-fit: cover;
+ border-radius: 4px;
+ border: 1px solid var(--snip-border);
+ transition: border-color 0.3s ease;
+}
+
+/* .theme-link:hover img {
+ border-color: var(--highlight);
+} */
+
+.theme-tooltip {
+ display: none; /* Hidden by default */
+ position: absolute;
+ bottom: 10px; /* Position at the bottom of the image */
+ left: 50%;
+ transform: translateX(-50%);
+ background-color: rgba(0, 0, 0, 0.7); /* Semi-transparent background */
+ color: #fff;
+ padding: 5px 10px;
+ border-radius: 4px;
+ font-size: 14px;
+ white-space: nowrap;
+}
+
+.theme-link:hover .theme-tooltip {
+ display: block; /* Show tooltip on hover */
+}
+
+.themes-settings-menu {
+ display: flex;
+ flex-wrap: wrap;
+ justify-content: space-between;
+ background: var(--snip-background);
+ color: var(--fg);
+ border-radius: 4px;
+ padding: 10px;
+ gap: 10px;
+}
+
+@media (max-width: 600px) {
+ .theme-link {
+ width: 100%;
+ }
+}
+
+/* --- */
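The `.theme-link` / `.theme-tooltip` rules above imply a particular markup shape: an anchor wrapping a preview image, with a tooltip span revealed on hover. The actual settings.html structure is not shown in this hunk, so the nesting, `href`, and image path below are assumptions inferred from the selectors:

```html
<!-- Hypothetical sketch of the markup these rules target.
     .theme-link is position: relative so the absolutely positioned
     .theme-tooltip anchors to it; the tooltip is display: none until
     .theme-link:hover .theme-tooltip switches it to display: block. -->
<div class="themes-settings-menu">
  <a class="theme-link" href="/settings?theme=mocha">
    <img src="/static/images/mocha.webp" alt="Mocha theme preview">
    <span class="theme-tooltip">Mocha</span>
  </a>
</div>
```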
diff --git a/static/css/style.css b/static/css/style.css
index 33fb56b..a132aaa 100644
--- a/static/css/style.css
+++ b/static/css/style.css
@@ -120,6 +120,7 @@
}
.clean {
+ margin-top: 30px;
max-width: 775px;
}
@@ -483,6 +484,16 @@ hr {
font-size: 16px;
}
+.video_title a {
+ color: var(--link);
+ text-decoration: none;
+}
+
+.video_title a:hover {
+ color: var(--link);
+ text-decoration: underline;
+}
+
.video-results-margin {
margin-bottom: 0px !important;
}
@@ -546,7 +557,7 @@ hr {
left: 0;
right: 0;
z-index: 2;
- border: 2px solid var(--search-bg-input-border);
+ border: 1px solid var(--search-bg-input-border);
}
.wrapper input {
@@ -679,7 +690,8 @@ hr {
}
.kno_wiki_show {
- display: initial !important;
+ display: initial !important;
+ border-radius: 6px;
}
.open-in-new-tab * {
@@ -786,6 +798,7 @@ form.torrent-sort {
transition: all 0.3s cubic-bezier(.25, .8, .25, 1);
border: 1px solid var(--border);
border-radius: 10px;
+ top: 24px;
}
.search-menu h2 {
@@ -841,7 +854,6 @@ form.torrent-sort {
margin: 5px;
}
-
.view-image-search {
border: 1px solid var(--snip-border);
margin: 0;
@@ -1154,7 +1166,7 @@ p {
background-color: inherit;
font-size: 14px;
font-family: 'Inter';
- margin-right: 10px;
+ margin-right: 14px;
color: var(--search-button);
margin-top: 72px;
padding-bottom: 11px;
@@ -1208,6 +1220,10 @@ p {
margin-top: -2px;
}
+.link {
+ color: var(--link);
+}
+
.results a:visited h3,
.result_sublink a:visited h3 {
color: var(--link-visited);
@@ -1226,6 +1242,7 @@ p {
.results p,
.result_sublink p {
margin-top: 0px;
+ color: var(--font-fg);
}
.results a,
@@ -1235,6 +1252,7 @@ p {
.result_sublink a:hover,
.result_sublink a:visited {
text-decoration: none;
+ /* color: #ebecf7; */
font-size: 14px;
}
@@ -1277,6 +1295,12 @@ p {
font-size: 22px;
}
+.wiki_known_for {
+ margin: 0px;
+ font-weight: normal;
+ margin-bottom: 10px;
+}
+
.snip img {
float: right;
max-width: 30%;
@@ -1285,8 +1309,7 @@ p {
margin-left: 10px;
}
-.snip a {
- display: block;
+.snip .wiki_link {
margin-top: 10px;
text-decoration: none;
color: var(--link);
@@ -1300,6 +1323,28 @@ p {
text-decoration: underline;
}
+.snip .about {
+ font-size: 18px;
+ margin-top: 10px;
+ margin-bottom: 5px;
+}
+
+.factpoint {
+ color: var(--fg);
+ font-weight: bold;
+ vertical-align: text-top;
+ text-align: left;
+ padding-right: 14px;
+}
+
+.fact a {
+ color: var(--link);
+ text-decoration: none;
+}
+.fact a:visited {
+ color: var(--link-visited);
+}
+
.snipp {
padding: 10px;
border-bottom: 1px solid var(--border);
@@ -1680,6 +1725,7 @@ body, h1, p, a, input, button {
margin-top: 0px;
top: 0px;
left: 0px;
+ /* background-color: var(--search-bg); */
}
.mobile-none {
@@ -1826,8 +1872,6 @@ body, h1, p, a, input, button {
}
-/* --- */
-
/* Ensuring dark theme compliance */
@media (prefers-color-scheme: dark) {
.leaflet-control-locate,
@@ -1849,8 +1893,7 @@ body, h1, p, a, input, button {
color: var(--link) !important;
}
}
-
-/* Variables for light theme */
+/*
:root {
--background-color: #ffffff;
--text-color: #000000;
@@ -1879,7 +1922,6 @@ body, h1, p, a, input, button {
--box-shadow: #00000020;
}
-/* Styles for dark theme */
@media (prefers-color-scheme: dark) {
:root {
--background-color: #202124;
@@ -1908,4 +1950,4 @@ body, h1, p, a, input, button {
--green: #8ab4f8;
--box-shadow: #ffffff20;
}
-}
+} */
diff --git a/static/images/black.webp b/static/images/black.webp
new file mode 100644
index 0000000..44386b5
Binary files /dev/null and b/static/images/black.webp differ
diff --git a/static/images/dark.webp b/static/images/dark.webp
new file mode 100644
index 0000000..ec0a27c
Binary files /dev/null and b/static/images/dark.webp differ
diff --git a/static/images/latte.webp b/static/images/latte.webp
new file mode 100644
index 0000000..146d583
Binary files /dev/null and b/static/images/latte.webp differ
diff --git a/static/images/light.webp b/static/images/light.webp
new file mode 100644
index 0000000..be65002
Binary files /dev/null and b/static/images/light.webp differ
diff --git a/static/images/mocha.webp b/static/images/mocha.webp
new file mode 100644
index 0000000..5940ac4
Binary files /dev/null and b/static/images/mocha.webp differ
diff --git a/static/images/night.webp b/static/images/night.webp
new file mode 100644
index 0000000..3329268
Binary files /dev/null and b/static/images/night.webp differ
diff --git a/templates/files.html b/templates/files.html
old mode 100644
new mode 100755
index 67ed60d..af39437
--- a/templates/files.html
+++ b/templates/files.html
@@ -1,112 +1,113 @@
-
-
-
-
-
- {{.Query}} - Ocásek
-
-
-
-
-
-
- Fetched in {{ .Fetched }} seconds
-
- {{ if .Results }}
-
-
- {{ range .Results }}
-
- {{ if .Error }}
-
{{ .Error }}
- {{ else }}
-
{{ .URL }}
-
{{ .Title }}
-
{{ if .Views }}{{ .Views }} views • {{ end }}{{ .Size }}
-
Seeders: {{ .Seeders }} | Leechers: {{ .Leechers }}
- {{ end }}
-
- {{ end }}
-
-
-
-
- {{ else }}
-
- Your search '{{ .Query }}' came back with no results.
- Try rephrasing your search term and/or recorrect any spelling mistakes.
-
- {{ end }}
-
-
-
+
+
+
+
+
+ {{.Query}} - Ocásek
+
+
+
+
+
+
+
+ Fetched in {{ .Fetched }} seconds
+
+ {{ if .Results }}
+
+
+ {{ range .Results }}
+
+ {{ if .Error }}
+
{{ .Error }}
+ {{ else }}
+
{{ .URL }}
+
{{ .Title }}
+
{{ if .Views }}{{ .Views }} views • {{ end }}{{ .Size }}
+
Seeders: {{ .Seeders }} | Leechers: {{ .Leechers }}
+ {{ end }}
+
+ {{ end }}
+
+
+
+
+ {{ else }}
+
+ Your search '{{ .Query }}' came back with no results.
+ Try rephrasing your search term and/or correcting any spelling mistakes.
+
+ {{ end }}
+
+
+
diff --git a/templates/forums.html b/templates/forums.html
old mode 100644
new mode 100755
index c7752aa..0246110
--- a/templates/forums.html
+++ b/templates/forums.html
@@ -1,91 +1,92 @@
-
-
-
-
-
- {{.Query}} - Ocásek
-
-
-
-
-
-
-
- {{if .Results}}
- {{range .Results}}
-
-
- {{end}}
- {{else}}
-
No results found for '{{ .Query }}'. Try different keywords.
- {{end}}
-
-
-
-
-
-
-
+
+
+
+
+
+ {{.Query}} - Ocásek
+
+
+
+
+
+
+
+
+ {{if .Results}}
+ {{range .Results}}
+
+
+ {{end}}
+ {{else}}
+
No results found for '{{ .Query }}'. Try different keywords.
+ {{end}}
+
+
+
+
+
+
+
diff --git a/templates/images.html b/templates/images.html
old mode 100644
new mode 100755
index 9ef54fe..ee3794a
--- a/templates/images.html
+++ b/templates/images.html
@@ -1,164 +1,169 @@
-
-
-
-
-
- {{.Query}} - Ocásek
-
-
-
-
-
-
-
-
- {{ if .Results }}
-
-
- {{ range .Results }}
-
- {{ end }}
-
-
- {{ else if .NoResults }}
-
No results found for '{{ .Query }}'. Try different keywords.
- {{ else }}
-
Looks like this is the end of results.
- {{ end }}
-
-
- Searching for new results...
-
-
-
-
+
+
+
+
+
+ {{.Query}} - Ocásek
+
+
+
+
+
+
+
+
+
+ {{ if .Results }}
+
+
+ {{ range .Results }}
+
+ {{ end }}
+
+
+ {{ else if .NoResults }}
+
No results found for '{{ .Query }}'. Try different keywords.
+ {{ else }}
+
Looks like this is the end of results.
+ {{ end }}
+
+
+ Searching for new results...
+
+
+
+
+
diff --git a/templates/map.html b/templates/map.html
index f698229..dedcb37 100644
--- a/templates/map.html
+++ b/templates/map.html
@@ -5,6 +5,7 @@
{{ .Query }} - Ocásek
+
diff --git a/templates/search.html b/templates/search.html
old mode 100644
new mode 100755
index 4cc6ab8..1300d4e
--- a/templates/search.html
+++ b/templates/search.html
@@ -1,28 +1,95 @@
-
-
-
-
-
- Search with Ocásek
-
-
-
-
-
-
-
-
+
+
+
+
+
+ Search with Ocásek
+
+
+
+
+
+
+
+
+
+
+
+
+
diff --git a/templates/settings.html b/templates/settings.html
index b484e7e..73843f6 100644
--- a/templates/settings.html
+++ b/templates/settings.html
@@ -4,78 +4,87 @@
Settings - Ocásek
-
+
+
+
-