
X Window System Release 3 (X11R3) was introduced on Cray's UNICOS (the UNIX successor to the Cray OS, COS) in late 1989, using a ported 64-bit Xlib. But it was not widely used within the small Cray community.

But MIT cooked up the X11 protocol (the wire protocol beneath Xlib) between late 1985 and 1986, on Univac and UNIX systems, in C, with many other X libraries written in Common Lisp.

X10R3 mostly stabilized Xlib around a few platforms and CPU architectures (the DDX layer) in a "long" preparation for X11R1 in September 1987.

https://www.x.org/wiki/X11R1/


Some basic questions from a cybersecurity vulnerability researcher:

- what kind of authentication protocol stack is used

- what algorithms are used for network protocol encryption (hash, block cipher, stream encryption)

- is data centrally stored? If so, is it encrypted at rest? Do the keys stay on the phones?

- any accounting audit done? (Moot but just a check mark in a small-family-business-oriented checkbox)

Great pricing!!


It is the intake-only transport of software/LLMs through a waterfall-type (one-way) security process that brings the LLM (and related analysis tools: RAG, HyDE) into the SCIF.

Nothing is taken out except product results at the appropriate classification level.

Then I saw "Newsweek" for what it is: a yellow-journalism rag that was bought for exactly $1.00 USD by a billionaire car-radio magnate as a gift for his wife, a radical ex-US Congresswoman, so she could prop her party up.

https://www.forbes.com/sites/williampbarrett/2010/10/07/for-...


If it weren't for Internet infrastructure hobbling SCTP (via firewalls), SCTP would provide the same session multiplexing as QUIC within the same 5-tuple, with far lower packet overhead and a smaller code base too.

As with any network protocol design, the tradeoff is a slight gain in versatility against a loss of privacy. So it depends on your triage of needs: security, privacy, confidentiality.

Now, with the latest "quadage" making it four: unobservability (plausible deniability).
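
For the curious, here is a minimal sketch of that stream multiplexing using the RFC 6458 SCTP socket API. This is an illustration, not a definitive implementation: it assumes Linux with the lksctp headers installed (link with -lsctp), and the peer address, port, and stream counts are made up for the example.

    /* One SCTP association (one 5-tuple) carrying two independent
     * streams; loss on one stream does not head-of-line-block the
     * other. Assumes Linux + lksctp; build with -lsctp. */
    #include <stdio.h>
    #include <string.h>
    #include <sys/socket.h>
    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <netinet/sctp.h>

    int main(void)
    {
        int fd = socket(AF_INET, SOCK_STREAM, IPPROTO_SCTP);

        /* Ask for several streams inside the single association. */
        struct sctp_initmsg init = { .sinit_num_ostreams = 8,
                                     .sinit_max_instreams = 8 };
        setsockopt(fd, IPPROTO_SCTP, SCTP_INITMSG, &init, sizeof(init));

        struct sockaddr_in peer = { .sin_family = AF_INET,
                                    .sin_port = htons(5000) };
        inet_pton(AF_INET, "192.0.2.1", &peer.sin_addr); /* example addr */
        if (connect(fd, (struct sockaddr *)&peer, sizeof(peer)) < 0) {
            perror("connect");
            return 1;
        }

        /* Two messages, two streams, one 5-tuple. */
        const char *a = "request on stream 0";
        const char *b = "request on stream 1";
        sctp_sendmsg(fd, a, strlen(a), NULL, 0, 0, 0, 0 /* stream */, 0, 0);
        sctp_sendmsg(fd, b, strlen(b), NULL, 0, 0, 0, 1 /* stream */, 0, 0);
        return 0;
    }

Note there is no TLS anywhere in that sketch, which is exactly the gap the replies below point out.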


From what I recall, one downside to SCTP is that things like resuming from different IP addresses and arbitrarily changing the number of connections per socket didn't work well in standard SCTP. Plus the TLS story isn't as easy. QUIC makes that stuff easier to work with from an application perspective.

Still a fascinating protocol, doomed to be used exclusively as a weird middle layer for websockets and as a carrier protocol for internal telco networks.


Unfortunately, most of the existing standardized communication protocols conform to a broken model of networking in which security is not provided by the network layer.

Cryptography can't be thought of as an optional layer that people might want to turn on. That bad idea shows up in many software systems. It needs to be thought of as a tool to ensure that a behavior is provided reliably. In this case, that the packets are really coming from who you think they are coming from. There is no reason to believe that they are without cryptography. It's not optional; it's required to provide the quality of service that the user is expecting.

DTLS and QUIC both immediately secure the connection. QUIC then goes on to do its stream multiplexing. The important thing is that the connection is secured in (or just above) the network layer. Had OSI (or whoever else) gotten that part right, then all of these protocols, like SCTP, would actually be useful.


It all started with all grants and department expenditures being tagged with the employee ID of the supervisory authority.

In the latest 2024 budget expenses, a fairly good percentage were logged with no ID and no supervisor or delegated authority.

Better now: no ID, no money from the Treasury.


The entry vector is via user-loadable kernel modules.

It does not work if the kernel's Kconfig has:

    CONFIG_MODULES=n
All deliverables should have this Kconfig setting disabled.
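A quick sanity check on a running kernel (a sketch; paths vary by distro):

    # prints 'CONFIG_MODULES=y' or '# CONFIG_MODULES is not set'
    grep -w CONFIG_MODULES /boot/config-$(uname -r)

    # or, if the kernel was built with CONFIG_IKCONFIG_PROC=y:
    zgrep -w CONFIG_MODULES /proc/config.gz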

> CONFIG_MODULES=n

Does this work on normal Linux desktops? My impression was that either: 1) the kernel is too big (building everything in hits a link error), or 2) the system will not boot due to missing/misconfigured parts.


The sole blocker for CONFIG_MODULES=n is WiFi, and only just prior to the WiFi interface reaching its network UP state (during initial WiFi initialization).

Also, the kernel build will fail during 'make modules'/'make all'/'make' but will succeed for 'make bzImage'/'make install'.

Desktop Linux distros' WiFi requires SIGNED module support for internationalization of radio-band selection (the regulatory domain).

So, for kernel modules to be disabled on a desktop that still uses WiFi, one needs to rebuild the kernel with the WiFi driver built in (no module support), specifically configured to comply with one's country's radio authority.
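
A rough sketch of such a Kconfig fragment, assuming an Intel card (the driver symbols and firmware filenames are illustrative; match them to your actual hardware and linux-firmware tree):

    # no loadable-module support; everything below is built in (=y)
    CONFIG_MODULES=n
    CONFIG_CFG80211=y
    CONFIG_MAC80211=y
    CONFIG_IWLWIFI=y
    CONFIG_IWLMVM=y
    # nothing can be demand-loaded, so firmware must be compiled in too;
    # the signed regulatory database carries the radio-band (country) rules
    CONFIG_EXTRA_FIRMWARE="regulatory.db regulatory.db.p7s iwlwifi-cc-a0-77.ucode"
    CONFIG_EXTRA_FIRMWARE_DIR="/lib/firmware"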

Pesky little thing.


Many embedded systems and supercomputers disable modules for security or simplicity, but then all needed drivers must be built in. WiFi is a common casualty because it's normally modular due to firmware blobs provided as-is by WiFi manufacturers.

Also, many supercomputing facilities and hardened servers prohibit direct networking over WiFi drivers (because of unverifiable firmware blobs).

Your homelab should provide a direct Ethernet connection to your desktop.


Could always try:

    echo 1 > /proc/sys/kernel/modules_disabled

which is supposed to block dynamic loading of modules until a reboot. =3
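
A quick way to verify it took (the flag is one-way; it stays at 1 until reboot):

    # read back the flag; any further modprobe/insmod now fails with EPERM
    cat /proc/sys/kernel/modules_disabled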


This is not permanent; if the system is rebooted, it will be undone :)

The kernel boot line can be updated to set it at boot, via the sysctl. boot-parameter prefix (kernel 5.8 or newer):

    sysctl.kernel.modules_disabled=1
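
On GRUB-based distros, a rough sketch of making that persistent (file locations and the regenerate command vary; this is the Debian/Ubuntu flavor):

    # /etc/default/grub -- keep any options already present:
    GRUB_CMDLINE_LINUX="sysctl.kernel.modules_disabled=1"

    # regenerate the GRUB config, then reboot
    sudo update-grub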


I hate to break it to you, but that video’s from Chile, as it says in the chat.

See for yourself: https://maps.app.goo.gl/BctQ7hV798imS4cF8?g_st=ipc


Time to primary him


Ro is a man.


So, a clueless Luddite like me went and pulled the temperature reading database since 1903 and did the following:

- searched for all United States stations

- excluded all 14,200 US stations installed since 1973

- averaged/medianed the rest

Trend? Largely flat.

Did I jump the gun?

Was the exclusion of stations like number 80238 (Arcadia, FL), installed near heat-producing objects (air-conditioning condenser units, nuclear cooling towers, new parking lots), the cause? A valid cause?


I have many questions about your methodology:

- How many stations were you left with?

- Did that number decline over time (as you excluded replacement stations)?

- What was your scale?

- Are you willing to share your results?

- How is this still the top comment after 30 minutes?

But your comment touches on a common misconception, which is that heat islands must be excluded to accurately measure the overall temperature. You refer to the idea as "heat-producing objects", but I would argue that a parking lot is more of a heat reflecting object. More to the point, even heat islands must be considered as part of the worldwide climate, simply because they are part of this wide world. Their heat does not simply disappear (I hope you agree that would violate physics).

Imagine we want to measure the average temperature inside a single 30-foot by 10-foot room during winter. We have two probes: one near a burning fireplace on one end of the room, and one near a window on the other. If we excluded data from one probe or the other, do you believe we would get an accurate average reading?

Of course, when scientists are calculating a global temperature, they have to handle special cases in the data (like heat islands). This has been known for some time, and you can read more about it here: https://www.realclimate.org/index.php/archives/2007/07/no-ma...

I fear that I've spent too much time responding to this, but I wanted to take it on in earnest.


May I suggest you consult the website realclimate.org? This and many other classic denialist tropes are well addressed there. Thank you!


Care to paraphrase the relevant explanation?


https://www.realclimate.org/index.php/archives/2007/07/no-ma...

Basically, it's something that's taken into account. The two main ways are calibrating urban stations relative to nearby rural stations, and looking at the variation between windy and calm days, since the effects are larger on calm days (so if things are corrected well, there shouldn't be a difference).


Please no, because this would be feeding the sea lions.


It's not something that climatologists and meteorologists are unaware of. Even for longer-running weather stations you can get a heat island effect if they're in a city and it's built up over time. So while I can't say exactly why you got the result you did with this specific query and dataset, it is something that is taken into account when analyzing this kind of data.


Wait, so in response to an article about the current year being hot, you excluded the past 50 years and concluded that things are flat. Am I missing something from your post?


Not excluding the 50 years. Excluding stations installed then. Older stations still report. You have to tune the number right. If you choose 40 years it doesn’t work. So something must have happened in 1973.


Show me the source, Luke!


The other thing is that people use the classic chart trick of not starting the Y axis from zero Kelvin (the only scientific scale). If you do that you see that even including all these stations the trend is imperceptible.


On that scale, the difference between normal human body temperature and dangerous hyperthermia is just about imperceptible too. Even the difference between summer and winter is pretty small. Dunno why we bother with heating and air conditioning and having two sets of clothing.


Reminds me of that silly string theory that first surfaced (under my Christmas tree) some 51 years ago.

